hh.se Publications
1 - 3 of 3
  • 1.
    Ericson, Stefan K. (University of Skövde, Skövde, Sweden)
    Åstrand, Björn (Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research)
    Analysis of two visual odometry systems for use in an agricultural field environment (2018). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 166, p. 116-125. Article in journal (Refereed)
    Abstract [en]

    This paper analyses two visual odometry systems for use in an agricultural field environment. The impact of various design parameters and camera setups is evaluated in a simulation environment. Four real field experiments were conducted using a mobile robot operating in an agricultural field. The robot was controlled to travel in a regular back-and-forth pattern with headland turns. The experimental runs were 1.8–3.1 km long and consisted of 32–63,000 frames. The results indicate that a camera angle of 75° gives the best results with the least error. An increased camera resolution improves the result only slightly. The algorithm must reduce error accumulation by adapting the frame rate. The results also illustrate the difficulties of estimating roll and pitch using a downward-facing camera. The best results for full 6-DOF position estimation were obtained on a 1.8-km run using 6680 frames captured from the forward-facing cameras. The translation error (x, y, z) is 3.76% and the rotational error (i.e., roll, pitch, and yaw) is 0.0482 deg m⁻¹. The main contributions of this paper are an analysis of design option impacts on visual odometry results and a comparison of two state-of-the-art visual odometry algorithms, applied to agricultural field data. © 2017 IAgrE (A sketch of how these drift metrics are computed appears after this list.)

  • 2.
    Midtiby, Henrik Skov (The Maersk Mc-Kinney Moller Institute, University of Southern Denmark, Odense, Denmark)
    Åstrand, Björn (Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research)
    Jørgensen, Ole (Operations Management, Aarhus University, Tjele, Denmark)
    Jørgensen, Rasmus Nyholm (Signal Processing, Aarhus University, Aarhus, Denmark)
    Upper limit for context-based crop classification in robotic weeding applications (2016). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 146, p. 183-192. Article in journal (Refereed)
    Abstract [en]

    Knowledge of the precise position of crop plants is a prerequisite for effective mechanical weed control in robotic weeding applications, such as in crops like sugar beets, which are sensitive to mechanical stress. Visual detection and recognition of crop plants based on their shapes has been described many times in the literature. In this paper the potential of using knowledge about the crop seed pattern is investigated based on simulated output from a perception system. The reliability of position-based crop plant detection is shown to depend on the weed density (ρ, measured in weed plants per square metre) and the crop plant pattern position uncertainty (σx and σy, measured in metres along and perpendicular to the crop row, respectively). The recognition reliability can be described with the positive predictive value (PPV), which is limited by the seeding pattern uncertainty and the weed density according to the inequality PPV ≤ (1 + 2πρσxσy)⁻¹. This result matches computer simulations of two novel methods for position-based crop recognition as well as earlier reported field-based trials. © 2016 IAgrE (A small numeric check of this bound appears after this list.)

  • 3.
    Persson, Maria (Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS))
    Åstrand, Björn (Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS))
    Classification of crops and weeds extracted by active shape models (2008). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 100, no 4, p. 484-497. Article in journal (Refereed)
    Abstract [en]

    Active shape models (ASMs) for the extraction and classification of crops using real field images were investigated. Three sets of images of crop rows with sugar beet plants around the first true leaf stage were used. The data sets contained 276, 322 and 534 samples, equally distributed over crops and weeds. The weed populations varied between the data sets, resulting in 19% to 53% of the crops being occluded. Three ASMs were constructed using different training images and different description levels. The models managed to correctly extract up to 83% of the crop pixels and remove up to 83% of the occluding weed pixels. Classification features were calculated from the shapes of extracted crops and weeds and presented to a k-NN classifier. The classification results for the ASM-extracted plants were compared to classification results for manually extracted plants. It was judged that 81–87% of all plants extracted by ASM were classified correctly; the corresponding figure for manually extracted plants was 85–92%. (A minimal k-NN sketch appears after this list.)
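
The drift metrics quoted in entry 1 (translation error as a percentage of distance travelled, and rotational error in degrees per metre) are standard relative measures for visual odometry. The Python sketch below illustrates how such end-point metrics can be computed from estimated and ground-truth trajectories; it is not the authors' evaluation code, and the trajectory data are hypothetical.

    # Illustrative only: end-point drift metrics for a visual odometry run.
    # gt_* arrays are ground truth, est_* arrays are the VO estimate.
    import numpy as np

    def vo_drift_metrics(gt_xyz, est_xyz, gt_rpy_deg, est_rpy_deg):
        # Path length from the ground-truth trajectory (metres).
        dist = np.sum(np.linalg.norm(np.diff(gt_xyz, axis=0), axis=1))
        # End-point translation error as a percentage of distance travelled.
        trans_err_pct = 100.0 * np.linalg.norm(est_xyz[-1] - gt_xyz[-1]) / dist
        # End-point rotation error normalised by distance (deg per metre).
        rot_err_deg_per_m = np.linalg.norm(est_rpy_deg[-1] - gt_rpy_deg[-1]) / dist
        return trans_err_pct, rot_err_deg_per_m

    # Hypothetical 1.8 km straight run with a slowly accumulating drift.
    n = 1000
    t = np.linspace(0.0, 1.0, n)[:, None]
    gt_xyz = np.column_stack([np.linspace(0.0, 1800.0, n), np.zeros(n), np.zeros(n)])
    est_xyz = gt_xyz + t * np.array([30.0, 55.0, 5.0])   # metres of drift at the end
    gt_rpy = np.zeros((n, 3))
    est_rpy = gt_rpy + t * np.array([10.0, 20.0, 80.0])  # degrees of drift at the end
    print(vo_drift_metrics(gt_xyz, est_xyz, gt_rpy, est_rpy))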
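
The inequality in entry 2 puts a ceiling on the positive predictive value of purely position-based crop recognition. Below is a quick numeric check of the bound, with made-up values for weed density and seeding uncertainty.

    # PPV upper bound from entry 2: PPV <= 1 / (1 + 2*pi*rho*sigma_x*sigma_y).
    import math

    def ppv_upper_limit(rho, sigma_x, sigma_y):
        # rho: weed density (plants per square metre);
        # sigma_x, sigma_y: position uncertainty along/across the row (metres).
        return 1.0 / (1.0 + 2.0 * math.pi * rho * sigma_x * sigma_y)

    # Hypothetical example: 100 weeds per square metre and 10 mm uncertainty
    # in each direction gives an upper bound of roughly 0.94.
    print(ppv_upper_limit(rho=100.0, sigma_x=0.01, sigma_y=0.01))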
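
Entry 3 feeds shape features extracted by the ASMs into a k-NN classifier. The sketch below is a minimal, self-contained stand-in for that final classification step only; the two-dimensional "shape features" and their values are invented for illustration, not taken from the paper.

    # Minimal k-NN crop/weed classification on synthetic shape features.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    # Hypothetical 2-D features (e.g. leaf area, compactness) for
    # 100 crop samples (label 1) and 100 weed samples (label 0).
    crops = rng.normal(loc=[4.0, 0.8], scale=0.3, size=(100, 2))
    weeds = rng.normal(loc=[2.5, 0.5], scale=0.3, size=(100, 2))
    X = np.vstack([crops, weeds])
    y = np.array([1] * 100 + [0] * 100)

    clf = KNeighborsClassifier(n_neighbors=5)
    print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy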
