Hi HN! My name is Rohan and, together with Paul, I’m the co-founder of OctaPulse (https://www.tryoctapulse.com/). We’re building a robotics layer for seafood production, starting with automated fish inspection. We are currently deployed at our first production site with the largest trout producer in North America.

You might be wondering how the heck we got into this with no background in aquaculture or the ocean industry. We are both from coastal communities: I am from Goa, India, and Paul is from Malta and Puerto Rico. Seafood is deeply tied to both our cultures and communities. We saw firsthand the damage being done to our oceans and how wild fish stocks are being fished to near extinction. We also learned that fish is the main protein source for almost 55% of the world's population. Seafood consumption may not be huge in America, but globally it is massive. And then we found out that America imports 90% of its seafood. What? That felt absurd. That was the initial motivation for starting this company.

Paul and I met at an entrepreneurship happy hour at CMU. We met to talk about ocean tech. It went on for three hours. I was drawn to building in the ocean because it is one of the hardest engineering domains out there. Paul had been researching aquaculture for months and kept finding the same thing: a $350B global industry with less data visibility than a warehouse. After that conversation we knew we wanted to work on this together.

Hatcheries, the early-stage, on-land part of production, are full of labor-intensive workflows that are perfect candidates for automation. Farmers need to measure their stock for feeding, breeding, and harvest decisions, but fish are underwater and get stressed when handled. Most farms still sample manually: they net a few dozen fish, anesthetize them, place them on a table to measure one by one, and extrapolate to populations of hundreds of thousands. It takes about 5 minutes per fish and the data is sparse.
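To give a feel for why sparse manual sampling hurts, here is a small sketch of the statistics involved. The numbers are hypothetical (a tank of 200,000 trout with a made-up weight distribution), but they show how wide the confidence interval stays when you extrapolate from a net of ~30 fish:

```python
import random
import statistics

random.seed(42)

# Hypothetical population: 200,000 trout, mean weight ~350 g, sd ~80 g.
population = [random.gauss(350, 80) for _ in range(200_000)]

# Manual sampling: net ~30 fish and extrapolate to the whole tank.
sample = random.sample(population, 30)
mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5

# 95% confidence interval on the mean weight, from just 30 fish.
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
print(f"estimate: {mean:.0f} g, 95% CI: [{lo:.0f} g, {hi:.0f} g]")
```

The interval spans tens of grams either way, and that uncertainty feeds directly into feeding and harvest decisions. Imaging every fish that passes a camera collapses it.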

When we saw this process we were baffled. There had to be a better way. This was the starting point that really kicked us off.

Here is the thing though. Most robots are not built to handle humid and wet environments. Salt water is the enemy of anything mechanical, and corrosion is a constant battle. Don't get me started on underwater computer vision, which has to contend with turbidity and suspended particles. Fish move unpredictably and deform while swimming. Occlusion is constant. Calibration is tricky in uncontrolled setups. Handling live fish with robotics is another challenge that hasn't really been solved before: fish are slippery, fragile, and stress easily. On top of all this, every material has to be food safe.

On the vision side we are using Luxonis OAK cameras which give us depth plus RGB in a compact form factor. The onboard Myriad X VPU lets us run lightweight inference directly on the camera for things like detection and tracking without needing to send raw frames over USB constantly. For heavier workloads like segmentation and keypoint extraction we bump up to Nvidia Jetsons. We have tested on the Orin Nano and Orin NX depending on power and thermal constraints at different sites.

The models themselves are CNN- and transformer-based architectures. We are running YOLO variants for detection, custom segmentation heads for body outlines, and keypoint models for anatomical landmarks. The tricky part is getting these to run fast enough on edge hardware. We are using a mix of TensorRT, OpenVINO, and ONNX Runtime depending on the deployment target. Quantization has been a whole journey. INT8 quantization on TensorRT gives us the speed we need, but you have to be careful about accuracy degradation, especially on the segmentation outputs where boundary precision matters. We spent a lot of time building calibration datasets that actually represent the variance we see on farms. Lighting changes throughout the day, water clarity shifts, fish density varies. Your calibration set needs to capture all of that or your quantized model falls apart in production.
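The core idea behind a representative calibration set is stratified sampling: pick frames evenly across condition buckets rather than at random, so rare conditions (dawn light, turbid water) are not drowned out by the common case. A minimal sketch, with hypothetical condition tags standing in for our real metadata:

```python
import random
from collections import defaultdict

random.seed(0)

def stratified_calibration_set(frames, per_bucket=8):
    """Pick an equal number of frames per condition bucket so INT8
    calibration sees the full range of farm conditions, not just the
    most common one. `frames` is a list of (frame_id, conditions),
    where conditions is a tuple like (lighting, turbidity) --
    hypothetical tags for this sketch."""
    buckets = defaultdict(list)
    for frame_id, conditions in frames:
        buckets[conditions].append(frame_id)
    calib = []
    for conditions, ids in sorted(buckets.items()):
        calib.extend(random.sample(ids, min(per_bucket, len(ids))))
    return calib

# Toy catalogue: most frames are midday/clear, a handful are dawn/turbid.
frames = [(f"f{i}", ("midday", "clear")) for i in range(500)]
frames += [(f"g{i}", ("dawn", "turbid")) for i in range(20)]
calib = stratified_calibration_set(frames, per_bucket=8)
# Both buckets contribute 8 frames despite the 25:1 imbalance.
```

The resulting frame list is what we would feed to the quantizer's calibrator; the stratification is what keeps the rare-condition activations inside the calibrated dynamic range.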

There is no Wi-Fi at most of these farms, so we are using Starlink for connectivity in remote or offshore locations. Everything runs locally first and syncs when a connection is available. We are not streaming video to the cloud. All inference happens on device.
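The local-first pattern here is essentially store-and-forward: measurements spool to disk on the device and only drain upstream when the link check passes. A minimal sketch of the idea (not our actual sync layer, and the field names are made up):

```python
import json
import tempfile
from pathlib import Path

class SyncQueue:
    """Store-and-forward queue: measurements are appended to local JSON
    files and only uploaded -- then deleted -- once a link is up."""

    def __init__(self, spool_dir):
        self.spool = Path(spool_dir)
        self.spool.mkdir(parents=True, exist_ok=True)
        self._n = 0

    def record(self, measurement: dict):
        # Zero-padded names keep glob() order equal to insertion order.
        path = self.spool / f"{self._n:08d}.json"
        path.write_text(json.dumps(measurement))
        self._n += 1

    def flush(self, link_up: bool, upload) -> int:
        """Upload and remove spooled records; return how many were sent."""
        if not link_up:
            return 0
        sent = 0
        for path in sorted(self.spool.glob("*.json")):
            upload(json.loads(path.read_text()))
            path.unlink()
            sent += 1
        return sent

# Usage: records accumulate offline, then drain when the link returns.
uploaded = []
q = SyncQueue(tempfile.mkdtemp())
q.record({"fish_id": 1, "length_mm": 212})
q.record({"fish_id": 2, "length_mm": 198})
q.flush(link_up=False, upload=uploaded.append)  # nothing leaves the device
q.flush(link_up=True, upload=uploaded.append)   # both records drain in order
```

Deleting only after a successful upload call is what makes the queue safe across power cuts, which are not hypothetical on a farm.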

Behind the scenes we have been building our own internal tooling for labeling, task assignment, and model management. Early on we tried existing labeling platforms but they did not fit our workflow. We needed tight integration between labeling, training pipelines, and deployment. So we built our own system where we can assign labeling tasks to annotators, track progress, version datasets, and push models to edge devices with a single command. It is not fancy but it keeps everything under our control and makes iteration fast. When you are trying to close the loop between data collection on farm, labeling, training, quantization, and deployment you cannot afford to have fragmented tooling. We needed one system that handles all of it.
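One piece of that loop worth sketching is dataset versioning. The trick we rely on is content addressing: hash the canonicalized labels so every trained model traces back to the exact annotations it saw. A sketch of the idea, not our real tooling (the annotation schema here is hypothetical):

```python
import hashlib
import json

def dataset_version(annotations: list) -> str:
    """Content-address a labeled dataset: a deterministic hash over the
    sorted, canonically serialized annotations. Any label change yields
    a new version id; identical labels always yield the same id."""
    canonical = json.dumps(
        sorted(annotations, key=lambda a: a["frame"]),
        sort_keys=True, separators=(",", ":"),
    )
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

v1 = dataset_version([{"frame": "f001", "label": "trout", "bbox": [10, 20, 80, 40]}])
v2 = dataset_version([{"frame": "f001", "label": "trout", "bbox": [10, 20, 81, 40]}])
# v1 != v2: a one-pixel bbox edit is a new dataset version.
```

Stamping this id into every training run and edge deployment is what lets us answer "which labels produced the model on that Jetson" without a heavyweight platform.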

On the robotics side we are building custom enclosures around off-the-shelf components and modifying delta robots with soft robotic grippers for handling. Vacuum and typical gripper actuation will not work in this environment, so we are using compliant grippers that can safely handle fish without damaging them. We started with the Delta X S as our test platform and are evaluating whether to move to industrial delta robots or build our own from scratch once we validate the kinematics and payload requirements in wet and humid environments. The end effector design is still evolving. Fish come in different sizes and body shapes depending on species and life stage, so we need grippers that can adapt.
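One simple way that adaptation shows up in software is mapping the vision pipeline's length estimate to a gripper preset before a pick. The size classes and thresholds below are hypothetical, just to illustrate the shape of the lookup:

```python
# Hypothetical gripper presets keyed on estimated fork length (mm).
# Each life stage gets its own aperture and grip-force profile; the
# real end effector is still evolving, this only sketches the mapping.
PRESETS = [
    (0, 120, "fry"),          # smallest aperture, lowest grip force
    (120, 250, "fingerling"),
    (250, 600, "adult"),
]

def select_preset(length_mm: float) -> str:
    """Return the gripper preset for a fish of the estimated length."""
    for lo, hi, name in PRESETS:
        if lo <= length_mm < hi:
            return name
    raise ValueError(f"no gripper preset for length {length_mm} mm")
```

In practice the interesting work is in the compliant hardware, not this lookup, but keeping the mapping explicit makes it easy to retune per species and site.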

Right now we are focused on operations outside the water. Hatchery phenotyping, sorting, quality inspection. These are more accessible than full underwater deployment and cheaper to start with. The idea is that if we can combine genetics data, environmental data, and phenotypic imagery we can help farms identify which fish to breed and which to cull. This is where selective breeding starts.

Something that surprised us early on: only a tiny fraction of farmed fish species have been through genetic improvement programs. Chickens grow 4x faster than they did in 1950 because of decades of selective breeding, but most farmed fish are essentially wild genetics. The opportunity to improve aquaculture genetics is massive, but it is completely bottlenecked on measurement. You cannot improve what you cannot measure, and today farms can barely measure anything at scale.

The industry moves on trust though. We are dealing with live animals and farms are cautious about who they let near their stock. Coming from outside aquaculture, that trust had to be earned. Paul was already a Future Leader with the Coalition for Sustainable Aquaculture but the real turning point was attending World Aquaculture Society, the largest conference in the US. Through a connection of a connection he met the incoming lead geneticist at what became our first customer. That relationship turned into a paid pilot with the largest trout producer in North America.

I previously worked at ASML, Nvidia, Tesla, and Toyota. Paul worked at Bloomberg. We met at CMU and immediately knew that we wanted to tackle this problem and put our life's work into this.

We would love feedback from any of you who have worked on computer vision in harsh or unpredictable environments, edge deployment on constrained hardware, or gentle and appropriate handling of live animals with robotics. If you are running inference on Jetsons or OAK cameras and have opinions on quantization workflows we would love to hear what has worked for you. If you have aquaculture experience we are curious what problems we should be thinking about that we haven't encountered yet.

Dang told us you’re all used to demo videos but unfortunately we can’t share them due to NDAs. But here’s a photo of us building our initial dataset for phenotyping and morphometric analysis: https://drive.google.com/file/d/1z3oSlB8ed9hanrybzP24XTfjDJE....

This is a weird industry to be building in and we are learning something new every week. If you have experience with edge deployment, robotics in wet environments, or aquaculture itself we would love to hear your perspective. And if you just have questions about fish or the tech we are happy to go deep in the comments. Excited to hear what this community thinks.