A vision based row-following system for agricultural field machinery
Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
2005 (English). In: Mechatronics (Oxford), ISSN 0957-4158, E-ISSN 1873-4006, Vol. 15, no. 2, p. 251-269. Article in journal (Refereed). Published.
Abstract [en]

In the future, mobile robots will most probably navigate through fields autonomously to perform different kinds of agricultural operations. As most crops are cultivated in rows, an important step towards this long-term goal is the development of a row-recognition system that allows a robot to accurately follow a row of plants. In this paper we describe a new method for robust recognition of plant rows based on the Hough transform. Our method adapts to the size of the plants, is able to fuse information coming from two or more rows, and is very robust against the presence of many weeds. The accuracy of the position estimation relative to the row proved to be good, with a standard deviation between 0.6 and 1.2 cm depending on the plant size. The system has been tested on both an inter-row cultivator and a mobile robot. Extensive field tests have shown that the system is sufficiently accurate and fast to control the cultivator and the mobile robot in a closed-loop fashion, with a standard deviation of the position of 2.7 and 2.3 cm, respectively. The vision system is also able to detect exceptional situations by itself, for example the end of a row.
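The core Hough-transform idea behind the row recognizer can be illustrated with a minimal sketch. This is not the authors' implementation: the plant-size adaptation and multi-row fusion described in the abstract are omitted, and `hough_row_line` and its parameters are hypothetical names. Segmented plant-pixel coordinates vote for candidate row lines in a (theta, rho) accumulator; the dominant peak gives the row's lateral position and heading:

```python
import numpy as np

def hough_row_line(points, thetas=np.deg2rad(np.arange(-30, 31)), rho_res=1.0):
    """Vote plant-pixel coordinates into a (theta, rho) accumulator and
    return the dominant line, with rho = x*cos(theta) + y*sin(theta).
    Minimal sketch: no plant-size adaptation or multi-row fusion."""
    points = np.asarray(points, dtype=float)
    # rho of every point for every candidate angle, shape (n_points, n_thetas)
    rhos = points[:, 0, None] * np.cos(thetas) + points[:, 1, None] * np.sin(thetas)
    rho_max = np.abs(rhos).max() + rho_res
    bins = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((len(thetas), bins), dtype=int)
    idx = ((rhos + rho_max) / rho_res).astype(int)
    for ti in range(len(thetas)):
        np.add.at(acc[ti], idx[:, ti], 1)  # unbuffered voting per angle
    ti, ri = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[ti], ri * rho_res - rho_max

# A vertical crop row at x = 50 plus a few weed outliers:
row = [(50.0, float(y)) for y in range(0, 100, 5)]
weeds = [(20.0, 30.0), (80.0, 70.0), (35.0, 10.0)]
theta, rho = hough_row_line(row + weeds)  # theta = 0.0, rho ≈ 49.7
```

The weed points scatter their votes across many (theta, rho) cells, while the aligned crops concentrate theirs in one, which is why the approach tolerates heavy weed presence.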

Place, publisher, year, edition, pages
Amsterdam: Elsevier, 2005. Vol. 15, no 2, p. 251-269
Keywords [en]
Agricultural robots, Row-following, Vision-guidance, Crop-row location, Hough transforms
National Category
Robotics
Identifiers
URN: urn:nbn:se:hh:diva-239
DOI: 10.1016/j.mechatronics.2004.05.005
ISI: 000226871200007
Scopus ID: 2-s2.0-12244301226
Local ID: 2082/534
OAI: oai:DiVA.org:hh-239
DiVA id: diva2:237417
Available from: 2006-11-24. Created: 2006-11-24. Last updated: 2018-03-23. Bibliographically approved.
In thesis
1. Vision Based Perception for Mechatronic Weed Control
2005 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

The use of computer-based signal processing and sensor technology to guide and control different types of agricultural field implements increases the performance of traditional implements and even makes it possible to create new ones. This thesis advances the knowledge of vision-based perception for mechatronic weed control. The contributions are of four kinds:

First, a vision-based system for row guidance of agricultural field machinery has been proposed. The system uses a novel method, based on the Hough transform, for recognition of crop rows.

Second, a vision-based perception system has been proposed to discriminate between crops and weeds, using images from real situations in the field. Most crops are cultivated in rows and sown in a defined pattern, i.e. with a constant inter-plant distance. The proposed method introduces the concept of using these geometrical properties of the scene (context) for single-plant recognition and localization. A mathematical model of a crop row has been derived that models the probability of the positions of consecutive crops in a row. Based on this model, two novel methods for context-based classification between crops and weeds have been developed. Furthermore, a novel method that combines geometrical features of the scene (context) with individual plant features has been proposed. The method has been evaluated on two datasets of images of sugar-beet rows, achieving classification rates of 92% and 98%, respectively.
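The context idea — crops sit at a roughly constant sowing distance along the row, weeds do not — can be sketched as follows. This is a simplified illustration with a hypothetical name (`classify_by_context`) and a plain Gaussian tolerance, not the probabilistic crop-row model actually derived in the thesis:

```python
import numpy as np

def classify_by_context(positions, inter_plant=10.0, sigma=1.5, threshold=0.05):
    """Label each detected plant as crop or weed from row context alone.
    A plant is a crop candidate if some other plant lies roughly one
    sowing distance away along the row (Gaussian tolerance sigma).
    Illustrative only -- the thesis derives a full probabilistic model."""
    positions = np.sort(np.asarray(positions, dtype=float))
    labels = []
    for p in positions:
        # How far each pairwise spacing deviates from the sowing distance
        gaps = np.abs(np.abs(positions - p) - inter_plant)
        support = np.exp(-0.5 * (gaps / sigma) ** 2).max()  # best lattice fit
        labels.append("crop" if support > threshold else "weed")
    return positions, labels

# Crops sown 10 cm apart at 0, 10, 20, 30; a weed at 4.2 breaks the pattern:
positions, labels = classify_by_context([0, 10, 20, 30, 4.2])
# labels → ['crop', 'weed', 'crop', 'crop', 'crop'] (sorted order)
```

In the thesis this geometric evidence is further combined with individual plant features (shape, size), which is what lifts the classification rate on the sugar-beet datasets.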

The third contribution is the design of a mobile agricultural robot equipped with these perception systems and a mechanical weeding tool intended for intra-row weed control in ecologically cultivated crops.

The fourth contribution is a demonstration of the feasibility of the perception systems in real field environments, especially with respect to robustness and real-time performance. The row guidance system has been implemented in three different row cultivators and performed inter-row weed control at two commercial farms. The robot has proven able to follow a row structure by itself while performing weed control within the seed line of a crop row, i.e. intra-row cultivation.

Place, publisher, year, edition, pages
Göteborg: Chalmers tekniska högskola, 2005. p. 44
Series
Doktorsavhandlingar vid Chalmers tekniska högskola. Ny serie, ISSN 0346-718X ; 2328
Keywords
Vision-based perception, Plant recognition, Row following, Weed Control, Mechatronics in agriculture, Mobile robots
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:hh:diva-694
Local ID: 2082/1039
ISBN: 91-7291-646-X
Archive number: 2082/1039
OAI: 2082/1039
Public defence
2005-09-08, Wigforssalen, Högskolan i Halmstad, Kristian IV:s väg 3, Halmstad, 13:15 (English)
Note

Also in series: Technical report. D (Department of Computer Science and Engineering, Chalmers University of Technology), 1653-1787 ; 9

Additional papers:

Åstrand, B., Baerveldt, A.-J., Plant recognition and localization using context information, Proceedings of the IEEE Conference Mechatronics and Robotics 2004 – special session Autonomous Machines in Agriculture, Aachen, Germany, September 13-15, pp. 1191-1196, (2004).

Åstrand, B., Baerveldt, A.-J., Plant recognition and localization using context information and individual plant features, Submitted to International Journal of Pattern Recognition and Artificial Intelligence, June 2005.

Available from: 2007-05-28. Created: 2007-05-28. Last updated: 2018-03-23. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Åstrand, Björn; Baerveldt, Albert-Jan
