Launch HN: OctaPulse (YC W26) – Robotics and computer vision for fish farming
by rohxnsxngh
Hi HN! My name is Rohan and, together with Paul, I’m the co-founder of OctaPulse (https://www.tryoctapulse.com/). We’re building a robotics layer for seafood production, starting with automated fish inspection. We are currently deployed at our first production site with the largest trout producer in North America.
You might be wondering how the heck we got into this with no background in aquaculture or the ocean industry. We are both from coastal communities. I am from Goa, India and Paul is from Malta and Puerto Rico. Seafood is deeply tied to both our cultures and communities. We saw firsthand the damage being done to our oceans and how wild fish stocks are being fished to near extinction. We also learned that fish is the main protein source for almost 55% of the world's population. Seafood consumption is not huge in America, but it is massive globally. And then we found out that America imports 90% of its seafood. What? That felt absurd. That was the initial motivation for starting this company.
Paul and I met at an entrepreneurship happy hour at CMU. We met to talk about ocean tech. It went on for three hours. I was drawn to building in the ocean because it is one of the hardest engineering domains out there. Paul had been researching aquaculture for months and kept finding the same thing: a $350B global industry with less data visibility than a warehouse. After that conversation we knew we wanted to work on this together.
Hatcheries, the early stage on-land part of production, are full of labor intensive workflows that are perfect candidates for automation. Farmers need to measure their stock for feeding, breeding, and harvest decisions but fish are underwater and get stressed when handled. Most farms still sample manually. They net a few dozen fish, anesthetize them, place them on a table to measure one by one, and extrapolate to populations of hundreds of thousands. It takes about 5 minutes per fish and the data is sparse.
When we saw this process we were baffled. There had to be a better way. This was the starting point that really kicked us off.
Here is the thing though. Most robots are not built to handle humid and wet environments. Salt water is the enemy of anything mechanical. Corrosion is such a pain to deal with. Don't get me started on underwater computer vision which has to parse through water turbidity and particles. Fish move unpredictably and deform while swimming. Occlusion is constant. Calibration is tricky in uncontrolled setups. Handling live fish with robotics is another challenge that hasn't really been solved before. Fish are slippery, fragile, and stress easily. All of this is coupled with the requirement that all materials must be food safe.
On the vision side we are using Luxonis OAK cameras which give us depth plus RGB in a compact form factor. The onboard Myriad X VPU lets us run lightweight inference directly on the camera for things like detection and tracking without needing to send raw frames over USB constantly. For heavier workloads like segmentation and keypoint extraction we bump up to Nvidia Jetsons. We have tested on the Orin Nano and Orin NX depending on power and thermal constraints at different sites.
The models themselves are CNN and transformer based architectures. We are running YOLO variants for detection, custom segmentation heads for body outlines, and keypoint models for anatomical landmarks. The tricky part is getting these to run fast enough on edge hardware. We are using a mix of TensorRT, OpenVINO, and ONNX Runtime depending on the deployment target. Quantization has been a whole journey. INT8 quantization on TensorRT gives us the speed we need but you have to be careful about accuracy degradation especially on the segmentation outputs where boundary precision matters. We spent a lot of time building calibration datasets that actually represent the variance we see on farms. Lighting changes throughout the day, water clarity shifts, fish density varies. Your calibration set needs to capture all of that or your quantized model falls apart in production.
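To make the stratification idea concrete, here is a minimal sketch of how a calibration set can be sampled evenly across condition buckets instead of uniformly at random. The metadata fields (`hour`, `turbid`) are hypothetical stand-ins; a real pipeline would derive tags like these from capture timestamps and image statistics.

```python
import random
from collections import defaultdict

def build_calibration_set(frames, strata_keys, per_stratum, seed=0):
    """Sample frames evenly across condition strata (time of day,
    turbidity band, tank density) instead of uniformly at random,
    so the INT8 calibration set covers the variance seen on farm."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for frame in frames:
        buckets[tuple(frame[k] for k in strata_keys)].append(frame)
    calib = []
    for _, bucket in sorted(buckets.items()):
        calib.extend(rng.sample(bucket, min(per_stratum, len(bucket))))
    return calib

# Hypothetical tagged frames: 3 time-of-day bands x 2 turbidity bands
frames = [{"id": i, "hour": i % 24 // 8, "turbid": i % 2} for i in range(600)]
calib = build_calibration_set(frames, ["hour", "turbid"], per_stratum=50)
print(len(calib))  # 300: 6 strata x 50 frames each
```

Random sampling over-represents whatever conditions dominate your recordings; forcing equal counts per stratum is what keeps the quantized model from falling apart when conditions shift.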
There is no wifi at most of these farms so we are using Starlink for connectivity in remote or offshore locations. Everything runs locally first and syncs when connection is available. We are not streaming video to the cloud. All inference happens on device.
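The local-first pattern is roughly store-and-forward: write every measurement to local storage immediately, and drain the queue only when a link is up. This is an illustrative sketch, not our actual code; the class and method names are made up.

```python
import json
from collections import deque

class SyncQueue:
    """Store inference results locally; flush upstream only when a
    link (e.g. Starlink) is available. Raw video never leaves the
    device -- only the derived measurements are queued."""
    def __init__(self):
        self.pending = deque()
        self.synced = []

    def record(self, measurement):
        # Always persist locally first, regardless of connectivity.
        self.pending.append(json.dumps(measurement))

    def flush(self, is_connected):
        # Called periodically; drains the queue only while online.
        while is_connected and self.pending:
            self.synced.append(self.pending.popleft())

q = SyncQueue()
q.record({"fish_id": 1, "length_mm": 182})
q.flush(is_connected=False)   # offline: nothing leaves the device
q.record({"fish_id": 2, "length_mm": 190})
q.flush(is_connected=True)    # link up: both measurements sync
print(len(q.pending), len(q.synced))  # 0 2
```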
Behind the scenes we have been building our own internal tooling for labeling, task assignment, and model management. Early on we tried existing labeling platforms but they did not fit our workflow. We needed tight integration between labeling, training pipelines, and deployment. So we built our own system where we can assign labeling tasks to annotators, track progress, version datasets, and push models to edge devices with a single command. It is not fancy but it keeps everything under our control and makes iteration fast. When you are trying to close the loop between data collection on farm, labeling, training, quantization, and deployment you cannot afford to have fragmented tooling. We needed one system that handles all of it.
On the robotics side we are building custom enclosures around off the shelf components and modifying delta robots with soft robotics grippers for handling. Vacuum and typical gripper actuation will not work in this environment so we are using compliant grippers that can safely handle fish without damaging them. We started with the Delta X S as our test platform and are evaluating whether to move to industrial delta robots or build our own from scratch once we validate the kinematics and payload requirements in wet and humid environments. The end effector design is still evolving. Fish come in different sizes and body shapes depending on species and life stage so we need grippers that can adapt.
Right now we are focused on operations outside the water. Hatchery phenotyping, sorting, quality inspection. These are more accessible than full underwater deployment and cheaper to start with. The idea is that if we can combine genetics data, environmental data, and phenotypic imagery we can help farms identify which fish to breed and which to cull. This is where selective breeding starts.
Something that surprised us early on: only a tiny fraction of farmed fish species have been through genetic improvement programs. Chickens grow 4x faster than they did in 1950 because of decades of selective breeding. But most farmed fish are essentially wild genetics. The opportunity to improve aquaculture genetics is massive but it is completely bottlenecked on measurement. You cannot improve what you cannot measure, and right now farms can barely measure anything at scale.
The industry moves on trust though. We are dealing with live animals and farms are cautious about who they let near their stock. Coming from outside aquaculture, that trust had to be earned. Paul was already a Future Leader with the Coalition for Sustainable Aquaculture but the real turning point was attending World Aquaculture Society, the largest conference in the US. Through a connection of a connection he met the incoming lead geneticist at what became our first customer. That relationship turned into a paid pilot with the largest trout producer in North America.
I previously worked at ASML, Nvidia, Tesla, and Toyota. Paul worked at Bloomberg. We met at CMU and immediately knew that we wanted to tackle this problem and put our life's work into this.
We would love feedback from any of you who have worked on computer vision in harsh or unpredictable environments, edge deployment on constrained hardware, or gentle and appropriate handling of live animals with robotics. If you are running inference on Jetsons or OAK cameras and have opinions on quantization workflows we would love to hear what has worked for you. If you have aquaculture experience we are curious what problems we should be thinking about that we haven't encountered yet.
Dang told us you’re all used to demo videos but unfortunately we can’t share them due to NDAs. But here’s a photo of us building our initial dataset for phenotyping and morphometric analysis: https://drive.google.com/file/d/1z3oSlB8ed9hanrybzP24XTfjDJE....
This is a weird industry to be building in and we are learning something new every week. If you have experience with edge deployment, robotics in wet environments, or aquaculture itself we would love to hear your perspective. And if you just have questions about fish or the tech we are happy to go deep in the comments. Excited to hear what this community thinks.
Impressive engineering, genuinely. But I'd push back on the framing that scaling aquaculture is straightforwardly good.
Fish sentience is increasingly well-supported in the neuroscience literature. We already kill somewhere around 1-2 TRILLION fish annually... a number that dwarfs land animal slaughter yet attracts almost no ethical scrutiny. Optimising and scaling that system is worth examining carefully.
The part of your website that says "land can't feed 10B people, wild fisheries are maxed out, therefore aquaculture" also quietly ignores plant-based protein, which is more land-efficient and doesn't require instrumentalising sentient animals at industrial scale.
I'm not saying the engineering problems aren't interesting. They clearly are. But at 1-2 trillion deaths per year, this is the largest scale of animal killing in human history, and we're building better tools to do more of it.
This is a thoughtful critique and I appreciate you raising it directly. You are right that fish sentience is getting more attention in the literature and it deserves more ethical scrutiny than it currently receives. We do not take the position that scaling aquaculture is straightforwardly good without tradeoffs. There are real welfare concerns and the industry has not always handled them well.
A few thoughts on where we stand:
On welfare specifically, our technology actually reduces stress on fish compared to the current manual process. Traditional phenotyping involves netting, anesthesia, and physical handling. Our system measures fish without any of that. Less handling means lower cortisol, lower mortality, and healthier animals. We are not neutral on welfare. We think better measurement tools should lead to better treatment, not just faster growth.
On plant based protein, you are right that our framing glosses over it. Plant based is more efficient on land use and we are not arguing against it. The reality is that billions of people rely on fish as their primary protein source today and that is not going to change overnight. We are trying to make the aquaculture that already exists more sustainable and humane, not argue that it is the only path forward.
On the scale of killing, I do not have a good answer for that. It is a massive number and I understand why that gives you pause. What I can say is that if aquaculture is going to exist at scale, we would rather it be done with better data, less stress on the animals, and more intentional breeding practices than the status quo.
Interesting to see CV applied to aquaculture — this is one of those domains where the ROI on automation is enormous but underexplored because the talent pipeline skews toward adtech and fintech.
One thing I'd push on: how are you handling distribution shift from turbidity and lighting variation across facilities? In my experience deploying vision models in non-controlled environments (industrial, not fish specifically), the gap between lab accuracy and production accuracy is almost entirely driven by domain shift in image quality. Continuous calibration pipelines — where you flag low-confidence predictions for human review and retrain on the corrected labels — tend to matter more than the initial model architecture choice.
Also curious about the welfare angle that came up in the thread. Selective breeding guided by phenotype scoring has obvious parallels to the poultry industry's problematic optimization for growth rate. Are you building in any multi-objective constraints (e.g., health markers alongside growth metrics) to avoid that failure mode?
We are doing exactly what you described with continuous calibration. We have essentially built our own in-house labeling, ingesting, and task assignment software for these tasks. Low confidence predictions get flagged for review, corrected labels feed back into training, and we retrain on a rolling basis. We also stratify our calibration datasets intentionally by time of day, tank conditions, and fish density rather than just grabbing random frames. Early on our datasets were too homogenous and the models would work great in testing then degrade in production. The architecture matters less than having a tight feedback loop between deployment and retraining.
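The routing step is simple in principle: confidence-gate every prediction, auto-accept the high-confidence ones, and queue the rest for annotators so the corrected labels feed the next retrain. A toy sketch (threshold and field names are illustrative, not our production values):

```python
def route_predictions(preds, threshold=0.6):
    """Split model outputs into auto-accepted labels and a human
    review queue; corrected review labels later feed retraining."""
    accepted, review = [], []
    for p in preds:
        (accepted if p["conf"] >= threshold else review).append(p)
    return accepted, review

preds = [
    {"frame": "a.jpg", "label": "trout", "conf": 0.93},
    {"frame": "b.jpg", "label": "trout", "conf": 0.41},  # turbid frame
    {"frame": "c.jpg", "label": "trout", "conf": 0.77},
]
accepted, review = route_predictions(preds)
print(len(accepted), len(review))  # 2 1
```

The interesting engineering is everything around this loop (dataset versioning, retrain cadence, pushing models back to the edge), but the gate itself is where production failures first become visible.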
On the welfare angle, yes we are thinking about this carefully. The data we collect includes body shape, fin integrity, spinal curvature, and other morphological traits that are signals of fish health and robustness, not just growth rate. Farms that care about sustainability can use this to select for fish that are healthy and resilient rather than just fast growing. The tool is neutral but the selection criteria are up to the breeder. We do not want to enable the same failure mode that happened with poultry.
The talent pipeline point is interesting too. You are right that most CV talent ends up in adtech or fintech. We have found that people get excited about working on something physical and tangible once they realize the problems are just as hard.
I like your project. I'm a SWE working on something completely different by day but tinkering with several other side projects, including this: https://github.com/ratsbane/panda-mcp/ (which is accelerating my computer vision learning rapidly). I'm interested in your choice of the OAK cameras because I'm trying to select a camera model also. I started this with a simple USB webcam and added a PhotoNeo camera. I've just switched to RealSense and I've got a couple of ZED cameras on order, but now I'm looking at the OAK-D and thinking that it makes a lot of sense to run a NN directly on the camera.
Well, these days inference can be done on CPUs depending on the size of the model. The OAK-D cameras especially are literally just computers. They run Linux and you can SSH directly into them, which is insane. The camera itself is shaped like a heat sink to dissipate heat. It's honestly phenomenal. I talked to the CEO of the company as well and I'm really impressed with their machine and supply chain. So all in all, high hopes for them.
As a home aquaponics grower, I am really interested in the opportunity to develop tools that help this industry grow smarter. The impact to open-water fisheries can be undone if the markets can be affected to appreciate farm-raised fish for their quality.
I think there is such an incredible opportunity in the sector, and it probably looks a lot like any of the other sectors that have been augmented by data - gather giant piles of any measurable detail, and hope that after filtering you see a pattern that doesn't depend on your production environment running as many sensors ( or tensors ).
Last Thought: Fish transfer pumps are not only a thing, but one of the best ways to have the whole pond population march past your camera in a lighting environment where you have more control.
https://www.miprcorp.com/fish-pumping/ - just one example with decent pictures
This is a great comment. You are absolutely right about the data opportunity. The industry is so data sparse right now that even basic measurements at scale would be a step change. We are seeing that firsthand with our customer. They went from sampling a few dozen fish by hand to continuous measurement and the insights are already compounding.
Thank you for the fish pump link. We have looked at pump based systems as a way to create controlled measurement environments. You get consistent lighting, predictable fish orientation, and the fish are already moving through a constrained path. The challenge is you are still dealing with water turbidity, particulates, and bubbles in the flow which can mess with imaging. It is better than open water but not a free pass on the vision problems.
We have also been looking at pescalators which use an Archimedes screw design to lift fish out of the water. Some setups combine this with anesthetization for operations that require handling. The tradeoff is you are adding stress and complexity but you get a much cleaner imaging environment. There is no single right answer here and the best approach depends on the species, life stage, and what you are trying to measure. This is definitely technology that will develop over time as the industry matures.
What species are you working with in your aquaponics setup?
Tilapia, because the grow-out plan is very well documented. I'd happily sacrifice growth rate for a fish with higher "desirability" factor, and perhaps a lower optimal temperature. I previously tried Bluegill and lost them, I think, due to stress from temperature variation. I'd like to try them again or go with Catfish. Catfish are the top species (for food, by weight) produced in the US, and they seem nearly as durable as Tilapia in small systems.
The pescalators sound great. There are so many tools like that where the application specifics ( species, system, life stage ) could make room for a scalpel-precise optimization of some tool, but the benefits would have to come from scale, and there just haven't been many first-movers ( or they keep quiet and defend the moat ) who seem poised to raise the tide for the whole industry. It is very ripe for the work you are doing to help the downstream gains over generations of stocks.
Cheers to you guys!
Tilapia is a great species and its resilience is impressive. We have not started working with tilapia yet, but we love that it is one of the top species grown in developing countries thanks to its ability to thrive in warm and turbid water.
> Something that surprised us early on: only a tiny fraction of farmed fish species have been through genetic improvement programs. Chickens grow 4x faster than they did in 1950 because of decades of selective breeding.
I agree that there is an opportunity here for getting more calories per fish (and especially per input of feed, which is really what decades of chicken optimization are about). But the consequences of these changes for chicken welfare have been disastrous [0] and we're seeing a concerted effort to move to higher-welfare breeds (though still more efficient than ancestral breeds). Likewise, intensive salmon farming has led to widespread '“environmental dewilding,” or the process of modifying natural water bodies with artificial infrastructure — in this case, fish farm pens and cages — and polluting them' [1]. It sounds like there are lots of ways in which using more robots can make monitoring less-invasive, and therefore less stressful for fish. I certainly hope to see those attributes, rather than the potentially disastrous ones, emphasized as you move forward.
[0] https://www.ciwf.org/programmes/better-chicken/
[1] https://www.vox.com/future-perfect/468348/atlantic-salmon-fa...
This is a really important point and something we think about a lot. You are absolutely right that chicken optimization has come with serious welfare tradeoffs. Breeding purely for growth rate without considering the animal's ability to actually live comfortably is how you end up with birds that cannot walk properly. We do not want to enable that trajectory for fish.
The good news is that the data we are collecting can be used to select for more than just growth. Body shape, fin integrity, spinal curvature, and other morphological traits are all signals of fish health and welfare. Farms that care about sustainability can use this data to breed fish that are robust and healthy, not just fast growing. The tool is neutral but the selection criteria are up to the breeder.
On the environmental side, our focus right now is on land based hatcheries and recirculating aquaculture systems rather than open net pens in the ocean. These closed systems avoid a lot of the dewilding and pollution concerns you mentioned. They are more expensive to operate but they keep farmed fish separate from wild populations and give you much more control over waste and water quality. And yes, reducing handling stress is a big part of what we are building. The manual process today involves netting, anesthesia, and physical manipulation. Our system can measure fish without any of that. Less stress on the animal and better data for the farmer.
This is Awesome.
On a different note, if you bring this up or think about India too, how will it impact manual farmers whose entire livelihood is tied to doing the job? Or am I reading it (automation) wrong?
Btw, I’ve never liked a website taking over my mouse pointer or the scroll UI and behavior. But yours is so well done, it is lovely, cute, and is indeed very fishy.
Great question. We are specifically targeting vertically integrated farms which are usually massive scale operations with hundreds of thousands to millions of fish. These facilities already have significant labor costs for trained technicians and geneticists doing manual phenotyping and sorting. We are not going after small scale manual farmers or trying to be an eFishery type business that serves smallholders.
The labor dynamics are also different at these large farms. The work we are automating is repetitive, physically demanding, and hard to staff consistently. Most of the farm managers we talk to are not trying to replace people, they are struggling to find enough workers willing to do this work in the first place. Automation at this scale tends to shift jobs rather than eliminate them entirely. And thank you for the kind words on the website! We have gotten mixed reviews on the cursor so enjoy it while it lasts lol
This is an awesome concept. Thanks for sharing.
Have you had any issues with turbidity so far?
Thanks! Yes turbidity has been one of our bigger challenges. Water clarity can shift dramatically throughout the day depending on feeding, fish activity, and weather. We have had to build our calibration datasets to capture that variance otherwise the quantized models degrade fast in production. We are also experimenting with different lighting setups to cut through particulate but it is still a work in progress :)
Let me know if you are hiring. I'm based in Norway and did my masters in AI computer vision (object detection, segmentation). Have also published a paper in medical imaging, as well as sold my AI webapp/SaaS for six figures $ some months ago (the startup was not in CV, but in transcription/LLM/media monitoring).
We are hiring, would love to chat if you have some time.
Great product!
I wonder how do you manage data labeling? Do you outsource it by using data label vendors or do you have something in-house?
Great question. We are building our entire labeling and data management system in house. Early on we tried existing platforms but they did not fit our workflow. We have a lot of video data and need custom labeling for things like keypoints, body outlines, and deformity classification that off the shelf tools do not handle well. Building it ourselves is cheaper at our scale, gives us tighter integration between labeling, training pipelines, and deployment, and lets us iterate faster. We can assign tasks to annotators, version datasets, and push models to edge devices from one system. When you are trying to close the loop between data collection on farm and deployment you cannot afford fragmented tooling.
This is awesome. Specially loved the origin story. Have been to Goa a couple times, and never have I ever thought about all these from this angle. Currently I am working with a startup incubator in North East India, and we provide grants of upto $30,000 and have seen some good aquaculture ideas. Hope I can introduce the cohort with you some day.
Aqua startups need each other.
Thank you! Goa has a way of making the ocean feel like part of everyday life. It is hard to grow up there and not think about fishing and seafood at some point.
Would love to connect with your cohort. North East India has a ton of freshwater aquaculture potential and we are always looking to learn from founders working in different geographies and species.
You are right that aqua startups need each other. The industry is so fragmented and underserved that collaboration makes more sense than competition at this stage.
this is cool, I used to work in ag robotics and understand the economics require bravery, and loads of project finance eventually?
one thing: the fish cursor on the site is frustrating pls allow disabling.
yes, it's definitely capital intensive. feedback on the website is noted, will make the changes in the next release.
Have you familiarized yourself with Whooshh Innovations? They have been operating in this space for over a decade and have solved many of these problems. It is an interesting space for sure! Best of luck!
Thank you!
Whooshh has really interesting tech, more focused on the fish transport side, with products that move fish from tank to tank while performing some operations.
Our initial focus with inspection is taking high quality images of fish to pull the insights needed for maximizing efficiency and improving breeding programs. We have designed our system to drop easily into current operations so it is seamless.
You are working in the wrong direction. There is no point in grabbing live fish with a robotic hand. In fact, I would strongly advise against trying this approach in hatcheries.
We are usually interacting with anesthetized fish, and farms are already doing these tasks manually. We are also integrating cameras into existing workflows that do not handle the fish at all, to improve a farm's insight into its own operations.
There are many extremely manual tasks within farms that are a bottleneck; this is especially prevalent in offshore farms and with marine species.
this is just an objectively cool application of technology and some hardcore engineering. nice stuff guys
thank you, appreciate it
Shinkei was for the rich; you guys make it for all
Shinkei definitely has cool tech! Aquaculture has already surpassed commercial fishing in terms of production and has become the cheapest source of protein in many countries. We are excited to help the industry grow even further.
> The math of feeding 10 billion people only works if we farm the ocean
Even for marketing puffery, "only" seems reductive when most resource usage seems specific to a few animal products like cows and lamb: https://ourworldindata.org/land-use-diets
The broader point we were trying to make is that as population grows and arable land decreases, the protein mix will need to shift. Fish have a feed conversion ratio of around 1.0 to 1.5 compared to 6 to 10 for beef. They do not require land for grazing and aquaculture has significantly lower greenhouse gas emissions than most terrestrial livestock. That does not mean aquaculture is the only path forward but it is likely a bigger part of the equation than it is today, especially in regions where seafood is already a primary protein source. We are not arguing against plant based diets or other solutions. Just that aquaculture is underinvested relative to its potential and the infrastructure to scale it efficiently does not exist yet. That is the gap we are trying to fill.
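A back-of-the-envelope version of that feed conversion argument, using FCRs from the ranges above. The edible-yield and protein fractions here are illustrative round numbers, not measured values:

```python
def feed_required(kg_protein_out, fcr, edible_fraction=0.5, protein_fraction=0.2):
    """Rough kg of feed needed per kg of edible protein, given a feed
    conversion ratio (kg feed per kg live-weight gain). The edible and
    protein fractions are illustrative placeholders."""
    kg_liveweight = kg_protein_out / (edible_fraction * protein_fraction)
    return kg_liveweight * fcr

# Feed needed to produce 1 kg of edible protein
fish = feed_required(1, fcr=1.2)   # mid-range fish FCR
beef = feed_required(1, fcr=8.0)   # mid-range beef FCR
print(round(fish, 1), round(beef, 1))  # 12.0 80.0
```

Even with identical (and deliberately crude) yield assumptions for both animals, the FCR gap alone drives a several-fold difference in feed input.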
The fish cursor is cute, but extremely annoying.
I like it, but it should just be visual. It jacks my default scrollpad/mouse behavior, which is the annoying part.
Perhaps add a hook below the usual mouse real pointer and have a fish that is just a decoration that slowly swims to it.
^ this
interesting, I like the idea
yup I got similar feedback from other batchmates, gonna fix this.
Agreed. Feels icky. Made me want to leave the page as quickly as possible again.
We got a little too excited about the fish theme. Noted for the next iteration.
^ no way this was tested by anyone with eyes before it was deployed
I am a fan of the fish cursor. We should make the internet quirky again. This is like a modern take on Geocities websites, and they should do things like fish cursors now while they still can before a board of VCs comes in and makes them remove the fish cursor.
Ha, thank you. We figured if we are building robots for fish we might as well commit to the bit. Enjoy it while it lasts.
Thanks! The quantization tradeoffs have been a grind. We do not have an exact number but we found that a few thousand images was not enough once you account for the variance on farm. Lighting changes throughout the day, water clarity shifts between feedings, fish density varies by tank. Early on our calibration sets were too homogenous and the INT8 models would work great in testing and then fall apart when conditions shifted.
We also found that segmentation required significantly fewer images compared to keypoint pose detection models. Segmentation generalizes faster since you are just finding body boundaries. Keypoints are more finicky because anatomical landmarks vary a lot more across species, life stages, and body deformation while swimming. We had to be much more intentional about diversity in the keypoint training data. What made the difference overall was building calibration sets that intentionally captured edge cases. Low light, high turbidity, dense occlusion, different life stages. We also started stratifying by time of day and tank conditions rather than just grabbing random frames. It is still not perfect but the models are much more stable now.
Don't you find transformer based models perform better? Are they too heavy?
They're heavy, but we have some post-processing tasks related to the smoothness of the contours created by the labels. Sometimes CNN segmentation outputs are not smooth, and we do various calculations on a fish's contour to determine deformities or welfare indicators. For that we need smoother boundaries.
Given that, we have trained a variety of models and we are still experimenting with what works best. We are even considering using VLMs for certain tasks if we can fine-tune them well enough.
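The smoothing step itself is nothing exotic; a circular moving average over the contour points is roughly the idea. This is a toy sketch of the general technique, not our production post-processing:

```python
def smooth_contour(points, window=3):
    """Circular moving average over a closed contour's (x, y) points,
    to soften the jagged boundaries that per-pixel segmentation masks
    can produce before computing morphometrics from the outline."""
    n = len(points)
    half = window // 2
    out = []
    for i in range(n):
        xs = ys = 0.0
        for j in range(i - half, i + half + 1):
            x, y = points[j % n]  # wrap around: the contour is closed
            xs += x
            ys += y
        out.append((xs / window, ys / window))
    return out

# A rough square outline with one noisy spike at (2, 5)
square = [(0, 0), (2, 0), (4, 0), (4, 2), (4, 4), (2, 5), (0, 4), (0, 2)]
smoothed = smooth_contour(square, window=3)
```

After smoothing, the spike's y-coordinate is pulled back toward its neighbors, which is exactly what downstream curvature or deformity measurements need.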