Google sends robots into farmers' fields to check crops

Google has presented prototypes of robots that can inspect individual plants in the field to help farmers raise productivity. The buggy robots glide over fields on vertical supports, allowing them to move along the rows between plants without damaging them. The aim of the project is to collect the most accurate data possible on how well different crop-growing techniques perform.

«We hope that better tools will enable the agricultural industry to improve the way food is produced», says Elliott Grant, head of Project Mineral, in his statement. The project's mission is to meet the world's growing demand for food while supporting the sustainable development of the agricultural sector.

«What if we could untangle the genetic and environmental drivers of crop yield? We want to observe each plant and give it exactly the nutrition it needs», Grant said.

Although farmers may have information about soil composition or the weather, the plant buggy was developed to see how plants «actually grow and respond to their environment».

«Over the past few years, the plant buggy has travelled across strawberry fields in California and soybean fields in Illinois, collecting high-quality images of each plant and counting and classifying every berry and every bean», the statement says.

As well as counting fruit accurately, the buggy can record data on fruit size, plant height and leaf area. All of this data is fed into a machine-learning system that looks for patterns and generates insights useful to farmers.

Ian Drew, the founder and chairman of a technology company and a farmer who breeds sheep, said that robots in the fields could be used to check for errors, to make sure crops are planted and harvested at the optimal time, and even to pull weeds.

ORIENT news
