

A LiDAR Robot Might Just Be the Future of Small-Scale Agriculture

Robots usually love big, open fields — but most farms are small and chaotic.

Alexandra Gerea
April 29, 2025 @ 7:55 pm


A farm robot using LiDAR harvests strawberries from a high-bed cultivation field. Image credits: Osaka Metropolitan University.

Robots love big farms. But most farms aren’t big. According to one estimate, 72% of the world’s farms are smaller than one hectare, and that poses big problems for automation.

These smaller farms often face a labor crunch and rising production costs. They need automation just as much as large farms do, but most current robots are too bulky, too expensive, or too blind to work in tight quarters. So, Osaka Metropolitan University Assistant Professor Takuya Fujinaga developed a new robot that could make all the difference.

The new robot is small, boxy, and slow. It doesn’t look like the future. But with its ability to manage small spaces and greenhouses with LiDAR, it might just be what small farmers need.

What’s a robot to do?

In large-scale agriculture, the geometry of the field is usually input into the system. Then, the tractor or drone or robot has a pre-mapped path that it can adapt to using simple geometrical feedback. This doesn’t work in greenhouse farms — especially those using “high-bed cultivation,” where crops like strawberries grow in raised rows. First, the space is tight. Second, localization tools like GPS don’t work inside.

Fujinaga’s robot blends two simple ideas that, together, unlock a whole new level of agility in tight farming spaces. The first is waypoint navigation — a classic robotics approach where the bot moves from one set location to another, like walking from point A to point B on a map. This is what most autonomous vehicles rely on: get the coordinates, plan the path, follow it.
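The waypoint idea is simple enough to sketch in a few lines. This is a minimal illustration, not the paper's actual controller: a point robot steps toward each coordinate in turn at a fixed speed, snapping to the waypoint once it is within one step of it.

```python
import math

def step_toward(pos, waypoint, speed, dt):
    """Move `pos` one time step toward `waypoint` at a fixed speed.
    Returns the new position and whether the waypoint was reached."""
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed * dt:  # within one step: snap to the waypoint
        return waypoint, True
    return (pos[0] + dx / dist * speed * dt,
            pos[1] + dy / dist * speed * dt), False

def follow(waypoints, start=(0.0, 0.0), speed=0.5, dt=0.1):
    """Visit each waypoint in order; returns the final position."""
    pos = start
    for wp in waypoints:
        reached = False
        while not reached:
            pos, reached = step_toward(pos, wp, speed, dt)
    return pos
```

A real robot adds heading dynamics and obstacle handling on top of this, but the core loop — get coordinates, close the distance, move to the next point — is the same.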

The second idea is bed navigation, and this is where things get clever. Instead of trying to know exactly where it is on a big-picture map, the robot focuses on its immediate surroundings — specifically, the rows of raised cultivation beds next to it. Using a simple LiDAR sensor, it constantly scans the beds’ edges and adjusts its position and angle to stay parallel and centered, like threading a needle through fabric. It doesn’t care where it is on the farm; it just knows it’s in a row and needs to stay on track.
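To make the bed-following idea concrete, here is a hedged sketch of the kind of computation involved (the names and gains are illustrative, not from the study): fit a straight line to the LiDAR hits along a bed edge, read off how far the robot sits from the edge and how much its heading deviates from it, and steer proportionally to both errors.

```python
import math

def bed_alignment(points):
    """Fit a line to 2-D LiDAR hits on a bed edge (robot frame:
    x forward, y left). Returns (lateral_offset, heading_error):
    the signed distance to the edge at the robot's position, and
    the angle between the robot's heading and the edge direction."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    slope = sxy / sxx                 # least-squares fit: y = my + slope*(x - mx)
    heading_error = math.atan2(slope, 1.0)
    lateral_offset = my - slope * mx  # edge's y value at the robot's x = 0
    return lateral_offset, heading_error

def steering_command(points, target_offset, k_d=1.0, k_a=0.5):
    """Proportional correction toward the target distance from the bed.
    Gains k_d and k_a are placeholder values."""
    offset, angle = bed_alignment(points)
    return -k_d * (offset - target_offset) - k_a * angle
```

Run at every scan, this keeps the robot parallel and centered without ever consulting a global map, which is exactly why it keeps working when GPS is unavailable.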

LiDAR For Robot Farmers

For this task, it’s the LiDAR that does all the magic. LiDAR — short for light detection and ranging — is a sensing technology that uses laser pulses to measure distances with high precision. By firing thousands of tiny laser beams and timing how long they take to bounce back, a LiDAR unit builds a 2D or 3D map of the surrounding environment. In robotics and self-driving cars, it acts like a superhuman eye, detecting objects and obstacles in real time.

LiDAR image of trees in a forest
LiDAR has been used to map forests for over a decade. It could soon be used in farms as well. Image credits: Oregon State University.

For Fujinaga’s greenhouse robot, this is the critical part. GPS doesn’t work indoors and the environment — rows of nearly identical strawberry beds — doesn’t offer enough visual variety for camera-based systems to navigate reliably. LiDAR gives the robot a way to “feel” its environment through geometry alone, allowing it to track the edges of cultivation beds, stay aligned, and adapt instantly to changing layouts or drifting plastic sheets.

“If robots can move around the farm more precisely, the range of tasks that they can perform automatically will expand, not only for harvesting, but also for monitoring for disease and pruning,” Professor Fujinaga explained. “My research shows a possibility, and once this type of agricultural robot becomes more practical to use, it will make a significant contribution to improving work efficiency and reducing labor, especially for high-bed cultivation.”

Could This Become Widespread?

Strawberries in a dish with a greenhouse full of strawberry plants in the background

Fujinaga’s robot isn’t flashy. It doesn’t use deep learning or multi-modal sensor fusion. But what it does offer is something very important in agricultural automation: practicality. The robot’s ability to switch between general waypoint navigation and tight-row feedback control makes it adaptable to real-world greenhouses, where conditions change constantly, and no two farms look exactly the same.

The system was tested both in simulation and in a real greenhouse filled with strawberries. In both environments, the robot consistently stayed within ±5 centimeters of the target distance from the beds. It’s also relatively cheap. Although it’s still a prototype, there’s not much inside it that’s inherently expensive.

Of course, real farms are varied and difficult to operate in. Fujinaga has his sights set on making the simulations even more realistic — adding dynamic environments, variable lighting, shifting ground textures. The goal is to bring digital farming twins closer to the mess of the real world, so that future robots can be trained in simulations that feel just as chaotic as an actual day on the farm.

Because when it comes to agricultural robots, the question isn’t just “Can it work?”; it’s “Can it work anywhere?”

The study was published in Computers and Electronics in Agriculture.
