hh.se Publications
1 - 30 of 30
  • 1.
    Andreasson, Henrik
    et al.
    Örebro University, Örebro, Sweden.
    Bouguerra, Abdelbaki
    Örebro University, Örebro, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Rögnvaldsson, Thorsteinn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Gold-fish SLAM: An application of SLAM to localize AGVs (2012). In: Field and Service Robotics: Results of the 8th International Conference / [ed] Kazuya Yoshida & Satoshi Tadokoro, Heidelberg: Springer, 2012, pp. 585-598. Conference paper (Refereed)
    Abstract [en]

    The main focus of this paper is to present a case study of a SLAM solution for Automated Guided Vehicles (AGVs) operating in real-world industrial environments. The studied solution, called Gold-fish SLAM, was implemented to provide localization estimates in dynamic industrial environments, where there are static landmarks that are only rarely perceived by the AGVs. The main idea of Gold-fish SLAM is to treat the goods that enter and leave the environment as temporary landmarks, which can be used in combination with the rarely seen static landmarks to compute online estimates of AGV poses. The solution is tested and verified in a paper factory using an eight-ton diesel truck retrofitted with an AGV control system running at speeds up to 3 m/s. The paper also includes a general discussion on how SLAM can be used in industrial applications with AGVs. © Springer-Verlag Berlin Heidelberg 2014.

  • 2.
    Baerveldt, Albert-Jan
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Salomonsson, Tommy
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Vision-guided mobile robots for design competitions (2003). In: IEEE Robotics & Automation Magazine, ISSN 1070-9932, Vol. 10, no. 2, pp. 38-44. Article in journal (Refereed)
    Abstract [en]

    The use of popular and effective robot-design competitions in teaching system integration in engineering curricula was discussed. Such robot competitions give students open-ended problem spaces, teach them to work in groups, and stimulate creativity. The technical and pedagogical aspects of robot competitions, along with experiences and shortcomings, were also discussed.

  • 3.
    Baerveldt, Albert-Jan
    et al.
    Halmstad University, School of Business and Engineering (SET).
    Åstrand, Björn
    Halmstad University, School of Business and Engineering (SET).
    A low-cost colour vision-system for robot design competitions (1998). In: Mechatronics '98 / [ed] Josef Adolfsson and Jeanette Karlsén, Oxford: Pergamon Press, 1998, pp. 595-600. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a low-cost colour vision system intended mainly for robot design competitions, which are nowadays a popular, project-oriented way of teaching mechatronics in engineering curricula. The estimated cost is about 450 dollars, including the colour camera, and the system is small enough to be carried on board relatively small mobile robots. The system is built around a TMS C31 signal processor. We also present and discuss the experiences gained with the system in our robot design competition.

  • 4.
    Bouguerra, Abdelbaki
    et al.
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University, Sweden.
    Andreasson, Henrik
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University, Sweden.
    Lilienthal, Achim J.
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Rögnvaldsson, Thorsteinn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    An autonomous robotic system for load transportation (2009). In: IEEE Conference on Emerging Technologies & Factory Automation, 2009 (ETFA 2009), Piscataway, N.J.: IEEE Press, 2009, pp. 1-4. Conference paper (Refereed)
    Abstract [en]

    This paper presents an overview of an autonomous robotic system for material handling. The system is being developed by extending the functionalities of traditional AGVs to operate reliably and safely in highly dynamic environments. Traditionally, the reliable functioning of AGVs relies on the availability of adequate infrastructure to support navigation. In the target environments of our system, such infrastructure is difficult to set up in an efficient way. Additionally, the locations of the objects to handle are unknown, which requires runtime object detection and tracking. Another requirement to be fulfilled by the system is the ability to generate trajectories dynamically, which is uncommon in industrial AGV systems. ©2009 IEEE.

  • 5.
    Bouguerra, Abdelbaki
    et al.
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University.
    Andreasson, Henrik
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University.
    Lilienthal, Achim J
    Centre for Applied Autonomous Sensor Systems (AASS), Örebro University.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Rögnvaldsson, Thorsteinn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    MALTA: A System of Multiple Autonomous Trucks for Load Transportation (2009). In: Proceedings of the 4th European Conference on Mobile Robots: ECMR'09, September 23-25, 2009, Mlini/Dubrovnik, Croatia / [ed] Ivan Petrović & Achim J. Lilienthal, Zagreb: KoREMA, 2009, pp. 91-96. Conference paper (Refereed)
    Abstract [en]

    This paper presents an overview of an autonomous robotic material handling system. The goal of the system is to extend the functionalities of traditional AGVs to operate in highly dynamic environments. Traditionally, the reliable functioning of AGVs relies on the availability of adequate infrastructure to support navigation. In the target environments of our system, such infrastructure is difficult to set up in an efficient way. Additionally, the locations of the objects to handle are unknown, which requires that the system be able to detect and track object positions at runtime. Another requirement is that the system be able to generate trajectories dynamically, which is uncommon in industrial AGV systems.

  • 6.
    Ericson, Stefan
    et al.
    School of Technology and Society, University of Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    A vision-guided mobile robot for precision agriculture (2009). In: Precision Agriculture '09: Papers presented at the 7th European Conference on Precision Agriculture, Wageningen, The Netherlands, 6-8 July 2009 / [ed] E. J. van Henten; D. Goense; C. Lokhorst, Wageningen: Wageningen Academic Publishers, 2009, pp. 623-630. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a mobile robot that is able to perform crop-scale operations using vision as its only sensor. The system consists of a row-following system and a visual odometry system. The row-following system captures images from a forward-looking camera on the robot and extracts the crop rows using the Hough transform. Both the distance to the rows and the heading angle are provided, and both are used to control the steering. The visual odometry system uses two cameras in a stereo setup pointing perpendicular to the ground. It measures the travelled distance from the ground motion and compensates for height variation. Experiments were performed on an artificial field due to the season. The results show that the visual odometry has an accuracy of 1.3% of the travelled distance.
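
    The row-extraction step described above lends itself to a compact illustration. The following is a minimal sketch of Hough-transform crop-row detection, not the authors' implementation; the excess-green segmentation and all threshold values are illustrative assumptions:

    ```python
    # Sketch only: segment plants by excess green, then find the dominant
    # row line with the Hough transform. Thresholds are made-up placeholders.
    import cv2
    import numpy as np

    def detect_row(bgr):
        """Return (rho, theta) of the strongest row line, or None."""
        b, g, r = cv2.split(bgr.astype(np.float32))
        excess_green = 2 * g - r - b              # plants stand out from soil
        mask = (excess_green > 40).astype(np.uint8) * 255
        lines = cv2.HoughLines(mask, rho=1, theta=np.pi / 180, threshold=150)
        if lines is None:
            return None                           # e.g. the end of the row
        rho, theta = lines[0][0]                  # strongest accumulator peak
        # With camera calibration, lateral offset and heading error for the
        # steering controller follow from (rho, theta).
        return float(rho), float(theta)
    ```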

  • 7.
    Ericson, Stefan
    et al.
    University of Skövde, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Algorithms for Visual Odometry in Outdoor Field Environment (2007). In: Proceedings of the 13th IASTED International Conference on Robotics and Applications: August 29-31, 2007, Würzburg, Germany / [ed] K. Schilling, Anaheim, Calif.: ACTA Press, 2007, pp. 287-292. Conference paper (Refereed)
    Abstract [en]

    In this paper, different algorithms for visual odometry are evaluated for navigating an agricultural weeding robot in an outdoor field environment. Today an encoder wheel keeps track of the weeding tool's position relative to the camera, but the system suffers from wheel slippage and errors caused by the uneven terrain. To overcome these difficulties, the aim is to replace the encoders with visual odometry using the plant-recognition camera. Four different optical flow algorithms are tested on four different surfaces: indoor carpet, outdoor asphalt, grass and soil. The tests are performed on an experimental platform. The results show that the errors consist mainly of dropouts caused by exceeding the maximum speed, and of calibration errors due to uneven ground. The number of dropouts can be reduced by limiting the maximum speed and detecting missing frames. The calibration problem can be solved using stereo cameras: this gives a height measurement, and the calibration is then given by the camera mounting. The algorithm using normalized cross-correlation shows the best results in terms of number of dropouts, accuracy and calculation time.
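
    Since normalized cross-correlation came out best in this evaluation, a hedged sketch of frame-to-frame displacement estimation with NCC template matching follows. The correlation threshold and the calibration constant SCALE_MM_PER_PX are invented for illustration:

    ```python
    # Sketch only: estimate ground displacement between two grayscale frames
    # by matching a central patch of the previous frame in the current one.
    import cv2
    import numpy as np

    SCALE_MM_PER_PX = 0.5   # hypothetical ground resolution from calibration

    def displacement_mm(prev_gray, curr_gray):
        h, w = prev_gray.shape
        tpl = prev_gray[h // 4:3 * h // 4, w // 4:3 * w // 4]  # template
        res = cv2.matchTemplate(curr_gray, tpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(res)
        if score < 0.6:          # weak correlation: treat as a dropout frame
            return None
        dx = (loc[0] - w // 4) * SCALE_MM_PER_PX
        dy = (loc[1] - h // 4) * SCALE_MM_PER_PX
        return dx, dy
    ```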

  • 8.
    Ericson, Stefan
    et al.
    School of Technology and Society, University of Skövde, Skövde.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Row-detection on an agricultural field using omnidirectional camera (2010). In: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Piscataway, N.J.: IEEE Press, 2010, pp. 4982-4987. Conference paper (Refereed)
    Abstract [en]

    This paper describes a method of detecting parallel rows on an agricultural field using an omnidirectional camera. The method works both on cameras with a fisheye lens and cameras with a catadioptric lens. A combination of an edge-based method and a Hough transform method is suggested to find the rows. The vanishing point of several parallel rows is estimated using a second Hough transform. The method is evaluated on synthetic images generated with calibration data from real lenses. Scenes with several rows are produced, where each plant is positioned with a specified error. Experiments are performed on these synthetic images and on real field images. The results show that good accuracy is obtained for the vanishing point once it is detected correctly. Furthermore, they show that the edge-based method works best when the rows consist of solid lines, and the Hough method works best when the rows consist of individual plants. The experiments also show that the combined method provides better detection than using the methods separately.

  • 9.
    Ericson, Stefan
    et al.
    School of Technology and Society, University of Skövde, Skövde.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Stereo Visual Odometry for Mobile Robots on Uneven Terrain (2008). In: WCECS '08: Proceedings of the Advances in Electrical and Electronics Engineering - IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, Washington: IEEE Computer Society, 2008, pp. 150-157. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a stereo visual odometry system for mobile robots that is not sensitive to uneven terrain. Two cameras are mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross-correlation. A method for evaluating the system is developed, in which flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of the traveled distance on surfaces where the maximum height variation is measured to be 96 mm. The variance is measured over eight test runs, 5.6 m in total, to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.

  • 10.
    Ericson, Stefan
    et al.
    University of Skövde, School of Technology and Society, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Visual Odometry System for Agricultural Field Robots (2008). In: Proceedings of the World Congress on Engineering and Computer Science 2008, WCECS 2008, October 22-24, 2008, San Francisco, USA / [ed] S. I. Ao, Craig Douglas, W. S. Grundfest, Lee Schruben and Jon Burgstone, Hong Kong: International Association of Engineers, 2008, pp. 619-624. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a visual odometry system for agricultural field robots that is not sensitive to uneven terrain. A stereo camera system is mounted perpendicular to the ground, and height and traveled distance are calculated using normalized cross-correlation. A method for evaluating the system is developed, in which flower boxes containing representative surfaces are placed in a metal-working lathe. The cameras are mounted on the carriage, which can be positioned manually with 0.1 mm accuracy. Images are captured every 10 mm over 700 mm. The tests are performed on eight different surfaces representing real-world situations. The resulting error is less than 0.6% of the traveled distance on surfaces where the maximum height variation is measured to be 96 mm. The variance is measured over eight test runs, 5.6 m in total, to 0.040 mm. This accuracy is sufficient for crop-scale agricultural operations.

  • 11.
    Fan, Yuantao
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Aramrattana, Maytheewat
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Shahbandi, Saeed Gholami
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Nemati, Hassan Mashad
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Infrastructure Mapping in Well-Structured Environments Using MAV (2016). In: Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349, Vol. 9716, pp. 116-126. Article in journal (Refereed)
    Abstract [en]

    In this paper, we present the design of a surveying system for warehouse environments using a low-cost quadcopter. The system focuses on mapping the infrastructure of the surveyed environment. As unique and essential parts of the warehouse, the pillars of storage shelves are chosen as landmark objects for representing the environment. The map is generated by fusing the outputs of two different methods: a point cloud of corner features from the Parallel Tracking and Mapping (PTAM) algorithm, and pillar positions estimated by a multi-stage image analysis method. Localization of the drone relies on the PTAM algorithm. The system is implemented in the Robot Operating System (ROS) and MATLAB, and has been successfully tested in real-world experiments. The resulting map, after scaling, has a metric error of less than 20 cm. © Springer International Publishing Switzerland 2016.

  • 12.
    Gholami Shahbandi, Saeed
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Modeling of a Large Structured Environment: With a Repetitive Canonical Geometric-Semantic Model (2014). In: Advances in Autonomous Robotics Systems: 15th Annual Conference, TAROS 2014, Birmingham, UK, September 1-3, 2014. Proceedings / [ed] Michael Mistry, Aleš Leonardis, Mark Witkowski & Chris Melhuish, Heidelberg: Springer, 2014, Vol. 8717, pp. 1-12. Conference paper (Refereed)
    Abstract [en]

    The AIMS project attempts to link the logistic requirements of an intelligent warehouse with state-of-the-art core technologies of automation, by providing an awareness of the environment to the autonomous systems and vice versa. In this work we investigate a solution for modeling the infrastructure of a structured environment such as a warehouse by means of a vision sensor. The model is based on the expected pattern of the infrastructure, generated from and matched to the map. Generation of the model relies on a set of tools such as a closed-form Hough transform, the DBSCAN clustering algorithm, the Fourier transform and optimization techniques. The performance evaluation of the proposed method is accompanied by a real-world experiment. © 2014 Springer International Publishing.

  • 13.
    Gholami Shahbandi, Saeed
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Philippsen, Roland
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Semi-Supervised Semantic Labeling of Adaptive Cell Decomposition Maps in Well-Structured Environments (2015). In: 2015 European Conference on Mobile Robots (ECMR), Piscataway, NJ: IEEE Press, 2015, article no. 7324207. Conference paper (Refereed)
    Abstract [en]

    We present a semi-supervised approach for semantic mapping, by introducing human knowledge after unsupervised place categorization has been combined with an adaptive cell decomposition of an occupancy map. Place categorization is based on clustering features extracted from raycasting in the occupancy map. The cell decomposition is provided by work we published previously, which is effective for maps that can be abstracted by straight lines. Compared to related methods, our approach obviates the need for a low-level link between human knowledge and the perception and mapping sub-system, or the onerous preparation of training data for supervised learning. Application scenarios include intelligent warehouse robots which need a heightened awareness in order to operate with a higher degree of autonomy and flexibility, and integrate more fully with inventory management systems. The approach is shown to be robust and flexible with respect to different types of environments and sensor setups. © 2015 IEEE

  • 14.
    Gholami Shahbandi, Saeed
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Philippsen, Roland
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Sensor Based Adaptive Metric-Topological Cell Decomposition Method for Semantic Annotation of Structured Environments (2014). In: 2014 13th International Conference on Control Automation Robotics & Vision (ICARCV), Piscataway, NJ: IEEE Press, 2014, pp. 1771-1777, article no. 7064584. Conference paper (Refereed)
    Abstract [en]

    A fundamental ingredient for semantic labeling is a reliable method for determining and representing the relevant spatial features of an environment. We address this challenge for planar metric-topological maps based on occupancy grids. Our method detects arbitrary dominant orientations in the presence of significant clutter, fits corresponding line features with tunable resolution, and extracts topological information by polygonal cell decomposition. Real-world case studies taken from the target application domain (autonomous forklift trucks in warehouses) demonstrate the performance and robustness of our method, while results from a preliminary algorithm to extract corridors and junctions demonstrate its expressiveness. The contribution of this work starts with the formulation of metric-topological surveying of the environment, and a generic n-direction planar representation accompanied by a general method for extracting it from an occupancy map. The implementation also includes some semantic labels specific to warehouse-like environments. © 2014 IEEE.

  • 15.
    Hedenberg, Klas
    et al.
    Skövde University, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway (2015). In: Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor / [ed] Roger Bostelman & Elena Messina, Conshohocken: ASTM International, 2015, pp. 41-56. Chapter in book (Refereed)
    Abstract [en]

    Human-operated and driverless trucks often collaborate in a mixed workspace in industry and warehouses. This is more efficient and flexible than using only one kind of truck. However, since driverless trucks need to give way to human-operated trucks, a reliable detection system is required. Several challenges exist in the development of an obstacle detection system in an industrial setting. The first is to select interesting situations and objects. Overhanging objects are often found in industrial environments, e.g. the tines of a forklift. The second is choosing a detection system that has the ability to detect those situations. The traditional laser scanner, situated two decimetres above the floor, does not detect overhanging objects. The third is to ensure that the perception system is reliable. A solution used on trucks today is to mount a 2D laser scanner on top of the truck and tilt the scanner towards the floor. However, objects at the top of the truck will be detected too late, and a collision cannot always be avoided. Our aim is to replace the upper 2D laser scanner with a 3D camera, either structured light or time-of-flight (TOF). It is important to maximize the field of view in the desired detection volume; hence, the placement of the sensor is important. We conducted laboratory experiments to check and compare the sensors' capabilities for different colours, using tines and a model of a tine in a controlled industrial environment. We also conducted field experiments in a warehouse. The conclusion is that both the tested structured-light and TOF sensors have problems detecting black items that are non-perpendicular to the sensor at the distance of interest. It is important to optimize the light economy, meaning the illumination power, field of view and exposure time, in order to detect as many different objects as possible. Copyright © 2016 by ASTM International

  • 16.
    Hedenberg, Klas
    et al.
    School of Technology and Society, University of Skövde, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    A Trinocular Stereo System for Detection of Thin Horizontal Structures (2008). In: Advances in Electrical and Electronics Engineering: IAENG Special Edition of the World Congress on Engineering and Computer Science 2008, WCECS '08 / [ed] Sio-Iong Ao, Los Alamitos: IEEE Computer Society, 2008, pp. 211-218. Conference paper (Refereed)
    Abstract [en]

    Many vision-based approaches for obstacle detection state that thin vertical structures, e.g. poles and trees, are of importance. However, there are also problems in detecting thin horizontal structures. In an industrial setting there are horizontal objects, e.g. cables and the forks of forklifts, and slanting objects, e.g. ladders, that also have to be detected. This paper focuses on the problem of detecting thin horizontal structures. We introduce a test apparatus for testing thin objects as a complement to the test pieces for human safety described in the European standard EN 1525, Safety of industrial trucks - driverless trucks and their systems. The system uses three cameras, arranged as a horizontal pair and a vertical pair, which makes it possible to also detect thin horizontal structures. A sparse disparity map based on edges and a dense disparity map are used to identify problems with a trinocular system. Both methods use the sum of absolute differences to compute the disparity maps. Tests show that the proposed trinocular system detects all objects on the test apparatus; whether a sparse or dense method is used is not critical. Further work will implement the algorithm in real time and verify it on a final system in many types of scenery.
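
    For concreteness, here is a toy NumPy version of the sum-of-absolute-differences disparity search that both map variants rely on; the window size and disparity range are arbitrary assumptions, and the paper's real-time implementation is not reproduced:

    ```python
    # Sketch only: brute-force SAD disparity for one pixel of a rectified
    # (row-aligned) stereo pair.
    import numpy as np

    def sad_disparity(left, right, y, x, win=5, max_disp=48):
        half = win // 2
        patch = left[y - half:y + half + 1,
                     x - half:x + half + 1].astype(np.int32)
        best_d, best_cost = 0, np.inf
        for d in range(max_disp):
            if x - d - half < 0:
                break
            cand = right[y - half:y + half + 1,
                         x - d - half:x - d + half + 1].astype(np.int32)
            cost = np.abs(patch - cand).sum()   # sum of absolute differences
            if cost < best_cost:
                best_cost, best_d = cost, d
        return best_d
    ```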

  • 17.
    Hedenberg, Klas
    et al.
    University of Skövde, School of Technology and Society, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Obstacle Detection For Thin Horizontal Structures (2008). In: World Congress on Engineering and Computer Science: WCECS 2008: 22-24 October, 2008, San Francisco, USA, Hong Kong: International Association of Engineers, 2008, pp. 689-693. Conference paper (Refereed)
    Abstract [en]

    Many vision-based approaches for obstacle detection state that thin vertical structures, e.g. poles and trees, are of importance. However, there are also problems in detecting thin horizontal structures. In an industrial setting there are horizontal objects, e.g. cables and the forks of forklifts, and slanting objects, e.g. ladders, that also have to be detected. This paper focuses on the problem of detecting thin horizontal structures. The system uses three cameras, arranged as a horizontal pair and a vertical pair, which makes it possible to also detect thin horizontal structures. A comparison is made between a sparse disparity map based on edges and a dense disparity map with a column and row filter. Both methods use the sum of absolute differences to compute the disparity maps. Special interest has been paid to scenes with thin horizontal objects. Tests show that the sparse method based on the Canny edge detector works better for the environments we have tested.

  • 18.
    Hedenberg, Klas
    et al.
    School of Technology and Society, University of Skövde, Skövde, Sweden.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Safety standard for mobile robots: a proposal for 3D sensors (2011). In: Proceedings of the 5th European Conference on Mobile Robots, ECMR'2011 / [ed] Achim J. Lilienthal, Tom Duckett, Örebro: Centre for Applied Autonomous Sensor Systems (AASS), 2011, pp. 245-251. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a new and uniform way of evaluating 3D sensor performance. It is rare that standardized test specifications are used in research on mobile robots. A test rig with the objects from the industrial safety standard EN 1525, Safety of industrial trucks - driverless trucks and their systems, is extended with thin vertical and horizontal objects that represent a fork on a forklift, a ladder and a hanging cable. A comparison of a trinocular stereo vision system, a 3D TOF (Time-Of-Flight) range camera and a Kinect device is made to verify the use of the test rig. All sensors detect the objects in the safety standard EN 1525. The Kinect and the 3D TOF camera show reliable results for the objects in the safety standard at distances up to 5 m. The trinocular system is the only sensor in the test that detects the thin structures. The proposed test rig can be used to evaluate sensors intended to detect thin structures.

  • 19.
    Mashad Nemati, Hassan
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Gholami Shahbandi, Saeed
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Human Tracking in Occlusion based on Reappearance Event Estimation (2016). In: ICINCO 2016: 13th International Conference on Informatics in Control, Automation and Robotics: Proceedings, Volume 2 / [ed] Oleg Gusikhin, Dimitri Peaucelle & Kurosh Madani, SCITEPRESS, 2016, Vol. 2, pp. 505-511. Conference paper (Refereed)
    Abstract [en]

    Relying on the commonsense knowledge that the trajectory of any physical entity in the spatio-temporal domain is continuous, we propose a heuristic data association technique. The technique is used in conjunction with an Extended Kalman Filter (EKF) for human tracking under occlusion. Our method is capable of tracking moving objects, maintaining their state hypotheses even during occlusion, and associating targets reappearing from occlusion with existing hypotheses. The technique relies on the estimation of the reappearance event, both in time and location, accompanied by an alert signal that would enable more intelligent behavior (e.g. in path planning). We implemented the proposed method and evaluated its performance with real-world data. The result validates the expected capabilities, even in the case of tracking multiple humans simultaneously.
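
    To make the tracking-through-occlusion idea concrete, here is a minimal constant-velocity tracker with a chi-square association gate; with this linear motion model the EKF reduces to the ordinary Kalman equations. The noise levels, gate value and sample time are assumptions, not values from the paper:

    ```python
    # Sketch only: predict/update a 2D constant-velocity track and gate new
    # detections against it, so a hypothesis can coast through an occlusion.
    import numpy as np

    DT = 0.1
    F = np.array([[1., 0., DT, 0.], [0., 1., 0., DT],
                  [0., 0., 1., 0.], [0., 0., 0., 1.]])
    H = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]])
    Q = np.eye(4) * 0.01      # process noise (assumed)
    R = np.eye(2) * 0.05      # measurement noise (assumed)

    def predict(x, P):
        return F @ x, F @ P @ F.T + Q

    def update(x, P, z):
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        return x + K @ y, (np.eye(4) - K @ H) @ P

    def gate(x, P, z, chi2=9.21):            # 99% gate, 2 degrees of freedom
        y = z - H @ x
        S = H @ P @ H.T + R
        return float(y @ np.linalg.inv(S) @ y) < chi2
    ```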

  • 20.
    Mashad Nemati, Hassan
    et al.
    Islamic Azad University, Abhar branch, Abhar, Iran.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS).
    Tracking of People Using Laser Range Sensor in Occlusion Situations (2011). In: Proceedings of the 2011 International Conference on Information and Computer Technology (ICICT), 2011. Conference paper (Refereed)
  • 21.
    Midtiby, Henrik Skov
    et al.
    The Maersk Mc-Kinney Moller Institute, University of Southern Denmark, Odense, Denmark.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Jørgensen, Ole
    Operations Management, Aarhus University, Tjele, Denmark.
    Jørgensen, Rasmus Nyholm
    Signal Processing, Aarhus University, Aarhus, Denmark.
    Upper limit for context-based crop classification in robotic weeding applications (2016). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 146, pp. 183-192. Article in journal (Refereed)
    Abstract [en]

    Knowledge of the precise position of crop plants is a prerequisite for effective mechanical weed control in robotic weeding applications, such as in crops like sugar beet which are sensitive to mechanical stress. Visual detection and recognition of crop plants based on their shapes has been described many times in the literature. In this paper the potential of using knowledge about the crop seed pattern is investigated based on simulated output from a perception system. The reliability of position-based crop plant detection is shown to depend on the weed density (ρ, measured in weed plants per square metre) and the crop plant pattern position uncertainty (σx and σy, measured in metres along and perpendicular to the crop row, respectively). The recognition reliability can be described by the positive predictive value (PPV), which is limited by the seeding pattern uncertainty and the weed density according to the inequality PPV ≤ (1 + 2πρσxσy)⁻¹. This result matches computer simulations of two novel methods for position-based crop recognition, as well as earlier reported field-based trials. © 2016 IAgrE
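
    The bound is easy to evaluate numerically; the following computes the stated upper limit for an example weed density and seeding uncertainty (the example numbers are illustrative, not from the paper):

    ```python
    # PPV upper bound from the abstract:
    # PPV <= 1 / (1 + 2*pi*rho*sigma_x*sigma_y)
    import math

    def ppv_upper_bound(rho, sigma_x, sigma_y):
        return 1.0 / (1.0 + 2.0 * math.pi * rho * sigma_x * sigma_y)

    # e.g. 50 weeds per m^2 and 2 cm pattern uncertainty in both directions
    print(ppv_upper_bound(rho=50.0, sigma_x=0.02, sigma_y=0.02))  # ~0.888
    ```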

  • 22.
    Nemati, Hassan
    et al.
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Åstrand, Björn
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research.
    Tracking of People in Paper Mill Warehouse Using Laser Range Sensor (2014). In: UKSim-AMSS Eighth European Modelling Symposium on Computer Modelling and Simulation, EMS 2014 / [ed] David Al-Dabass, Valentina Colla, Marco Vannucci & Athanasios Pantelous, Los Alamitos, CA: IEEE Computer Society, 2014, pp. 52-57, article no. 7153974. Conference paper (Refereed)
    Abstract [en]

    In this paper a laser scanner based approach for simultaneous detection and tracking of people in an indoor environment is presented. The operation of an autonomous truck, for transporting paper reels in a dynamic environment shared with humans, is considered as the application setting for this work. Here, a human leg detection procedure and an Extended Kalman Filter (EKF) based tracking method are employed for real-time performance. Several experiments with different data sets collected from an autonomous forklift truck in a paper mill warehouse have been performed in an offline situation. The results show how the system is able to detect and track multiple moving people. ©2014 IEEE.

  • 23.
    Persson, Maria
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Classification of crops and weeds extracted by active shape models (2008). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 100, no. 4, pp. 484-497. Article in journal (Refereed)
    Abstract [en]

    Active shape models (ASMs) for the extraction and classification of crops using real field images were investigated. Three sets of images of crop rows with sugar beet plants around the first true leaf stage were used. The data sets contained 276, 322 and 534 samples, equally distributed over crops and weeds. The weed populations varied between the data sets resulting in from 19% to 53% of occluded crops. Three ASMs were constructed using different training images and different description levels. The models managed to correctly extract up to 83% of the crop pixels and remove up to 83% of the occluding weed pixels. Classification features were calculated from the shapes of extracted crops and weeds and presented to a k-NN classifier. The classification results for the ASM-extracted plants were compared to classification results for manually extracted plants. It was judged that 81–87% of all plants extracted by ASM were classified correctly. This corresponded with 85–92% for manually extracted plants.
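
    The classifier stage is a plain k-nearest-neighbour vote over shape features; a toy stand-in is shown below, where the feature vectors and the value of k are placeholders rather than the paper's choices:

    ```python
    # Sketch only: majority-vote k-NN over precomputed shape feature vectors.
    import numpy as np

    def knn_predict(train_X, train_y, x, k=5):
        """train_X: (n, d) features; train_y: (n,) int labels, e.g. 0=weed, 1=crop."""
        dist = np.linalg.norm(train_X - x, axis=1)   # Euclidean distances
        votes = train_y[np.argsort(dist)[:k]]        # labels of k nearest
        return np.bincount(votes).argmax()           # majority class
    ```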

  • 24.
    Petersson, Daniel
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Johansson, Jonas
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Holmberg, Ulf
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Intelligent systems (IS-lab).
    Torque Sensor Free Power Assisted Wheelchair (2007). In: ICORR '07: 2007 IEEE 10th International Conference on Rehabilitation Robotics: June 12-15, Noordwijk, The Netherlands, Piscataway, N.J.: IEEE Press, 2007, pp. 151-157. Conference paper (Refereed)
    Abstract [en]

    A power-assisted wheelchair combines human power, delivered by the arms through the pushrims, with electrical motors powered by a battery. Today's electric power-assisted wheelchairs use force sensors to measure the torque exerted on the pushrims by the user. This leads to rather expensive and clumsy constructions. A new design, which relies only on velocity feedback and thus avoids the use of expensive force sensors in the pushrims, is proposed in this paper. The control design is based on a simple PD structure with only two design parameters, easily tuned to fit a certain user: one parameter adjusts the amplification of the user's force and the other changes the lasting time of the propulsion influence. Since the new assisting control system relies only on the velocity, the torque-sensor-free power-assisted wheelchair will, besides giving the user assisting power, also give additional power to an assistant who pushes the wheelchair. This is a big advantage compared to the pushrim-activated design, where this benefit for the assistant is not possible.
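
    A rough sketch of the velocity-only assist law follows. The PD structure acting on measured wheel velocity is the paper's idea, but this particular discretization and the gain values are assumptions:

    ```python
    # Sketch only: assist torque from velocity feedback alone. A push shows
    # up as a velocity increase; one gain amplifies it, the other sets how
    # long the assistance lasts.
    class VelocityAssist:
        def __init__(self, k_amp=1.5, k_sustain=0.8, dt=0.02):
            self.k_amp = k_amp          # amplification of the user's push
            self.k_sustain = k_sustain  # lasting time of the propulsion
            self.dt = dt
            self.prev_v = 0.0

        def motor_torque(self, v):
            dv = (v - self.prev_v) / self.dt  # derivative of velocity only
            self.prev_v = v
            return self.k_amp * dv + self.k_sustain * v
    ```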

  • 25.
    Åstrand, Björn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Vision Based Perception for Mechatronic Weed Control (2005). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    The use of computer-based signal processing and sensor technology to guide and control different types of agricultural field implements increases the performance of traditional implements and even makes it possible to create new ones. This thesis increases the knowledge on vision-based perception for mechatronic weed control. The contributions are of four different kinds:

    First, a vision-based system for row guidance of agricultural field machinery has been proposed. The system uses a novel method, based on the Hough transform, for recognition of crop rows.

    Second is a proposal for a vision-based perception system to discriminate between crops and weeds, using images from real situations in the field. Most crops are cultivated in rows and sown in a defined pattern, i.e. with a constant inter-plant distance. The proposed method introduces the concept of using these geometrical properties of the scene (context) for single plant recognition and localization. A mathematical model of a crop row has been derived that models the probability for the positions of consecutive crops in a row. Based on this mathematical model two novel methods for context-based classification between crops and weeds have been developed. Furthermore, a novel method that combines geometrical features of the scene (context) and individual plant features has been proposed. The method has been evaluated in two datasets of images of sugar beet rows. The classification rate was 92 % and 98 %, respectively.

    The third contribution is the design of a mobile agricultural robot equipped with these perception systems and a mechanical weeding tool intended for intra-row weed control in ecologically cultivated crops.

    The fourth contribution is a demonstration of the feasibility of the perception systems in real field environments, especially with respect to robustness and real-time performance. The row guidance system has been implemented in three different row cultivators and performed inter-row weed control at two commercial farms. The robot has proven to be able to follow a row structure by itself, while performing weed control within the seed line of a crop row, i.e. intra-row cultivation. 

  • 26.
    Åstrand, Björn
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Baerveldt, Albert-Jan
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    A mobile robot for mechanical weed control (2003). In: International Sugar Journal, ISSN 0020-8841, Vol. 105, no. 1250, pp. 89-95. Article in journal (Refereed)
    Abstract [en]

    This paper presents an autonomous agricultural mobile robot for mechanical weed control in outdoor environments. The robot employs two vision systems: one grey-level vision system that is able to recognise the row structure formed by the crops and to guide the robot along the rows and a second, colour-based vision system that is able to identify a single crop among weed plants. This vision system controls a weeding-tool that removes the weed within the row of crops. It has been shown that colour vision is feasible for single plant identification, i.e. discriminating between crops and weeds. The system as a whole has been verified, showing that the subsystems are able to work together effectively. A first trial in a greenhouse showed that the robot is able to manage weed control within a row of sugar beet plants.

  • 27.
    Åstrand, Björn
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Baerveldt, Albert-Jan
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    A vision based row-following system for agricultural field machinery (2005). In: Mechatronics (Oxford), ISSN 0957-4158, Vol. 15, no. 2, pp. 251-269. Article in journal (Refereed)
    Abstract [en]

    In the future, mobile robots will most probably navigate through the fields autonomously to perform different kinds of agricultural operations. As most crops are cultivated in rows, an important step towards this long-term goal is the development of a row-recognition system, which will allow a robot to accurately follow a row of plants. In this paper we describe a new method for robust recognition of plant rows based on the Hough transform. Our method adapts to the size of plants, is able to fuse information coming from two rows or more, and is very robust against the presence of many weeds. The accuracy of the position estimation relative to the row proved to be good, with a standard deviation between 0.6 and 1.2 cm depending on the plant size. The system has been tested on both an inter-row cultivator and a mobile robot. Extensive field tests have shown that the system is sufficiently accurate and fast to control the cultivator and the mobile robot in a closed-loop fashion, with a standard deviation of the position of 2.7 and 2.3 cm, respectively. The vision system is also able to detect exceptional situations by itself, for example the occurrence of the end of a row.

  • 28.
    Åstrand, Björn
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Baerveldt, Albert-Jan
    Halmstad University.
    An agricultural mobile robot with vision-based perception for mechanical weed control (2002). In: Autonomous Robots, ISSN 0929-5593, E-ISSN 1573-7527, Vol. 13, no. 1, pp. 21-35. Article in journal (Refereed)
    Abstract [en]

    This paper presents an autonomous agricultural mobile robot for mechanical weed control in outdoor environments. The robot employs two vision systems: one gray-level vision system that is able to recognize the row structure formed by the crops and to guide the robot along the rows and a second, color-based vision system that is able to identify a single crop among weed plants. This vision system controls a weeding-tool that removes the weed within the row of crops. The row-recognition system is based on a novel algorithm and has been tested extensively in outdoor field tests and proven to be able to guide the robot with an accuracy of 2 cm. It has been shown that color vision is feasible for single plant identification, i.e., discriminating between crops and weeds. The system as a whole has been verified, showing that the subsystems are able to work together effectively. A first trial in a greenhouse showed that the robot is able to manage weed control within a row of crops.

  • 29.
    Åstrand, Björn
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Bouguerra, Abdelbaki
    Learning Systems Lab (AASS), Dept. of Technology, Örebro University, Sweden.
    Andreasson, Henrik
    Learning Systems Lab (AASS), Dept. of Technology, Örebro University, Sweden.
    Lilienthal, Achim J.
    Learning Systems Lab (AASS), Dept. of Technology, Örebro University, Sweden.
    Rögnvaldsson, Thorsteinn
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    An autonomous robotic system for load transportation (2009). In: Program and Abstracts, Fourth Swedish Workshop on Autonomous Robotics, SWAR'09 / [ed] Lars Asplund, Västerås: Mälardalen University, 2009, pp. 56-57. Conference paper (Other academic)
  • 30.
    Åstrand, Björn
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Johansson, Maria
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS).
    Segmentation of partially occluded plant leaves (2006). In: IWSSIP 2006: 13th International Conference on Systems, Signals and Image Processing, September 21-23, 2006, Budapest, Hungary / [ed] Balázs Enyedi, András Reichardt, Stockholm: Harlequin, 2006. Conference paper (Refereed)
    Abstract [en]

    N/A
