
Moving Object Tracking Mobile Robot

In this paper we present the integration of a mobile robotic system with a novel colour-based object-tracking control algorithm based on spatial coordinates under non-ideal environments. For object tracking we use a wireless vision sensor system of the kind used in robotics for identification and tracking of objects. The program is designed to capture a coloured object through a wireless surveillance camera interfaced with a PCI-based image grabber (TV tuner card) integrated with LABVIEW and MATLAB software, and to track the object based on its spatial coordinates and RGB colour.

The complete algorithm is integrated in LABVIEW Mathscript; the resulting data are sent over the serial COM port to command the robot's actions based on the threshold values. The hardware design, including the microcontroller description and the wireless data link, is discussed.

Keywords- Spatial Coordinates, Wireless Vision, Object Tracking, Microcontroller, Wireless RF Link, Mobile Robot

Introduction

Robotic systems are becoming ever more complex, and the demand for visual-feedback-based systems is growing rapidly with the development of efficient, self-adaptive mobile robotic systems.

In the system we present here, we combine an established colour-based tracking algorithm with a robotic system. The tracking is done on a desktop computer and the appropriate signals are sent to a microcontroller through a wireless interface.

The system attempts to keep the tracked object in the centre of its field of view. The robot's motion is therefore discrete rather than continuous. Incremental signals are sent to the microcontroller, i.e. if the object has moved further from the centre of the field of view, the magnitude of the required motion is greater, and vice versa. The electronics on board the robot have two primary tasks: to communicate with the PC and to control the motion of the robot. A wireless serial link is established between the computer and the microcontroller using wireless RF modules. Motion control is achieved by interfacing two geared DC motors to a standard H-bridge motor driver circuit. The microcontroller interprets the signals received from the computer and accordingly controls the motion of the robot by sending signals to the motor driver circuitry.

The mechanical structure is designed keeping in view the weight and size requirements and the required degree of mobility. Figure 1 illustrates the various logical and physical connections between the systems. The paper does not aim at designing a mobile robot; rather, it describes the integration of a simple mobile robot interfaced with LABVIEW Mathscript for tracking an object based on spatial coordinates. We describe the image capturing and processing techniques, followed by an introduction to a real robotic application that follows a red ball using the serial COM port of the PC through a wireless interface.

Section II describes robotic vision and control. Section III describes RGB and binary images for detection of an object based on RGB values. Section IV describes the details of object detection, positioning and the setting of RGB threshold values. Section V describes the LABVIEW Mathscript interfacing details along with the system model implementation, and the results are discussed in Section VI.

Robotic vision and control

The whole task of making a robot follow a red ball can be divided into four blocks: image acquisition, image processing, decision-making and motion control. Image acquisition can be achieved using a PC-based webcam or a digital video camera. This device captures the image and sends it to the camera processor for further processing in the computer; its main function is to convert the light energy received into electrical signals. Image processing involves conversion of RGB colour images into grayscale images, setting of threshold levels, saturation of the features into binary images and setting of cut-off values to remove noise in the binary image. Decision-making is done through the software program, and motion control through either software or constant monitoring by the operator from the keyboard. Before going into the software program in detail, let us look at the coordinate systems to understand the image acquisition process.

Block Diagram of Robotic Vision and Control System

Pixel coordinates.

Generally, the most convenient method for expressing locations in an image is to use pixel coordinates. Here the image is treated as a grid of discrete elements, ordered from top to bottom and left to right. That is, the first component 'r' (the row) increases downward, while the second component 'c' (the column) increases to the right (see Fig. 2). For example, the data for the pixel in the 3rd row, 2nd column is stored in the matrix element (3, 2). When the syntax for a function uses 'r' and 'c', it refers to the pixel coordinate system. MATLAB matrix subscripting is used to access the values of individual pixels.
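As a quick illustration of this (r, c) convention, consider the following sketch (Python here purely for illustration, with a made-up 3x3 image; the paper itself relies on MATLAB's native matrix subscripting):

```python
# Hypothetical 3x3 grayscale image stored row-major, mirroring the
# pixel-coordinate convention: rows grow downward, columns to the right.
image = [
    [10, 20, 30],   # row 1
    [40, 50, 60],   # row 2
    [70, 80, 90],   # row 3
]

def pixel(img, r, c):
    """Return the value at pixel coordinate (r, c), 1-based as in MATLAB."""
    return img[r - 1][c - 1]

# The pixel in the 3rd row, 2nd column -- element (3, 2) in the text.
value = pixel(image, 3, 2)   # -> 80
```
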

Spatial coordinates.

Consider a pixel as a square patch. From this perspective, a location such as (3.3, 2.2) is a spatial coordinate. Here, locations on the image plane are described in terms of 'x' and 'y'. When the syntax uses 'x' and 'y', it refers to the spatial coordinate system. Fig. 3 shows the coordinate convention of the spatial coordinate system; notice that 'y' increases downward. An image may be defined as a two-dimensional function f(x, y), where 'x' and 'y' are spatial coordinates, and the amplitude of 'f' at any pair of coordinates (x, y) is called the intensity or grey level of the image at that point. When the coordinates (x, y) and the amplitude of 'f' take finite, discrete values, we have a digital image. An intensity image is a data matrix whose values have been scaled to represent intensities. When the elements of an intensity image are of class 'uint8' or class 'uint16', they have integer values in the range [0, 255] or [0, 65535], respectively. The class of the elements used in this program is 'uint8'.

Pixel and Spatial Coordinate images

BINARY AND RGB IMAGES

Binary images (B&W).

A binary image is a logical array of 0's and 1's. Numeric arrays consisting of 0's and 1's are converted into binary, with '1' indicating white (maximum intensity) and '0' indicating black (minimum intensity):

>> imBw = logical([1 0 1; 0 0 1; 1 1 0])

RGB images.

A red, green and blue (RGB) image is an MxNx3 array of colour pixels, where each colour pixel is a triplet corresponding to the red, green and blue components of the RGB image at a specific spatial location. An RGB image may be viewed as a stack of three gray-scale images that, when fed to the red, green and blue channels of a colour monitor, produce a colour image. Eight bits are used to represent the pixel value of each component of the image; thus an RGB pixel corresponds to 24 bits.

Object detection and positioning

Object Detection

Capture a frame and store it in a variable, say, 'rgb_image'.

Extract the red, green and blue components of the image and store them in variables fR, fG and fB:

fR = rgb_image(:, :, 1); % extracts the red component

fG = rgb_image(:, :, 2); % extracts the green component

fB = rgb_image(:, :, 3); % extracts the blue component

Here, fR, fG and fB are image matrices.

Next, find the red object in the image. (R_THRESHOLD=) 140, (G_THRESHOLD=) 105 and (B_THRESHOLD=) 100 are specific numbers called 'thresholds'. The technique for finding these numbers is described later on. The following statement creates a B&W image array 'I':

I = ((fR >= 140) & (fG <= 105) & (fB <= 100));

That is, the result of logically ANDing the thresholded image matrices fR, fG and fB is stored in 'I'. A pixel of the binary image is set to '1' if the following three conditions are all satisfied:

fR >= 140: the red component of the pixel is greater than or equal to 140.

fG <= 105: the green component of the pixel is less than or equal to 105.

fB <= 100: the blue component of the pixel is less than or equal to 100.
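The per-pixel combination of these three conditions can be sketched in a language-neutral way as follows (Python for illustration, with a hypothetical 2x2 frame of (R, G, B) tuples; the paper's actual implementation is the single MATLAB statement above):

```python
# Illustrative re-implementation of the thresholding step.
# Threshold values are the ones quoted in the text.
R_THRESHOLD, G_THRESHOLD, B_THRESHOLD = 140, 105, 100

def red_mask(rgb_image):
    """Return a binary image: 1 where a pixel passes all three tests, else 0."""
    return [
        [1 if (r >= R_THRESHOLD and g <= G_THRESHOLD and b <= B_THRESHOLD) else 0
         for (r, g, b) in row]
        for row in rgb_image
    ]

# Made-up 2x2 frame: a strong red, a grey, a dull red, and white.
frame = [[(200, 30, 40), (120, 120, 120)],
         [(150, 100, 90), (255, 255, 255)]]
mask = red_mask(frame)   # -> [[1, 0], [1, 0]]
```
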

After making the B&W image, we find that apart from the region of the red ball there are also some unwanted white regions in the image. These unwanted white regions are called 'noise'. Before plotting the centre of the object, filter the noisy regions of the image as follows:

Se = strel('disk', 20); % creates a flat, disk-shaped structuring element with radius 20

B = imopen(I, Se); % morphological opening

Final = imclose(B, Se); % morphological closing

Morphological opening removes those parts of an object which cannot contain the structuring element; it smooths object contours, breaks thin connections and removes thin protrusions. Morphological closing also tends to smooth the contours of objects, but it joins narrow breaks and fills long, thin gulfs and holes smaller than the structuring element.
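The idea behind opening can be sketched with minimal binary morphology (Python for illustration; MATLAB's imopen/imclose handle arbitrary structuring elements, whereas this sketch hard-codes a 3x3 square element as a stand-in for the disk used in the paper):

```python
def erode(img):
    """3x3 erosion: a pixel survives only if its whole neighbourhood is 1
    (neighbours outside the image are treated as 0)."""
    h, w = len(img), len(img[0])
    return [[int(all(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)))
             for j in range(w)] for i in range(h)]

def dilate(img):
    """3x3 dilation: a pixel is set if any neighbour (or itself) is 1."""
    h, w = len(img), len(img[0])
    return [[int(any(0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)))
             for j in range(w)] for i in range(h)]

def opening(img):
    """Erosion followed by dilation: removes specks smaller than the element."""
    return dilate(erode(img))

# A 3x3 blob plus one isolated noise pixel at the top-left corner.
noisy = [[1, 0, 0, 0, 0],
         [0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1]]
clean = opening(noisy)   # the isolated pixel disappears; the blob survives
```
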

Having obtained the desired region, find the centre of the ball. The following statement computes all the connected components in a binary image:

[L, n] = bwlabel(Final);

Here 'n' is the total number of connected components and 'L' is the label matrix. (Each connected component is given a unique number.) The following statement returns the row and column indices of all pixels belonging to the k-th object:

[r, c] = find(L == k); % k = 1, 2, ..., n

rbar = mean(r); cbar = mean(c);

Variables 'rbar' and 'cbar' are the coordinates of the centre of mass. As the image has already been filtered, the final image should contain only one white region. But in case of a computational error due to excessive noise, there might be two connected components. So form a loop from '1' to 'n' using a 'for' statement, calculating the centre of mass for every object. If there are no components in a frame, control does not enter the loop and 'rbar' and 'cbar' remain initialised to zero. To check the output on the computer, use the following instructions:
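The labelling-and-centroid step can be sketched end to end as follows (Python for illustration, using 0-based coordinates and a simple flood fill; MATLAB's bwlabel/find/mean, shown above, are the implementation the paper actually uses):

```python
def label_components(img):
    """Label 4-connected white regions; return (label matrix, count)."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    n = 0
    for i in range(h):
        for j in range(w):
            if img[i][j] and not labels[i][j]:
                n += 1
                stack = [(i, j)]          # iterative flood fill
                while stack:
                    r, c = stack.pop()
                    if 0 <= r < h and 0 <= c < w and img[r][c] and not labels[r][c]:
                        labels[r][c] = n
                        stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, n

def centroid(labels, k):
    """Mean (row, col) of all pixels carrying label k."""
    pts = [(r, c) for r, row in enumerate(labels)
                  for c, v in enumerate(row) if v == k]
    return (sum(r for r, _ in pts) / len(pts),
            sum(c for _, c in pts) / len(pts))

binary = [[0, 0, 0, 0],
          [0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
labels, n = label_components(binary)   # one component
rbar, cbar = centroid(labels, 1)       # centre of the 2x2 blob
```
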

imshow(rgb_image);

hold on; plot(cbar, rbar, 'Marker', '*', 'MarkerEdgeColor', 'b');

These statements pop up a window in which a blue '*' mark is plotted on the detected centre of mass of the red ball. They have been commented out in the main program after testing.

Position of the object

The position of the red ball is determined as described below. Divide the frame captured by the camera (refer Fig. 4) into five regions by means of points 'x1' and 'x2' on the X-axis and points 'y1' and 'y2' on the Y-axis.

Calculate x1, x2, y1 and y2 as follows:

x1 = x/2 - numx; x2 = x/2 + numx;

y1 = y/2 - numy; y2 = y/2 + numy;

'numx' and 'numy' are tuning constants which you have to find out; they depend on the size of your ball. We have taken numx = 120 and numy = 30 in our program. Calculate the coordinates of the centre of the frame, which is simply (x/2, y/2); 'x' is the maximum dimension along the X-axis (640 in the program) and 'y' is the maximum dimension along the Y-axis (480 in the program).
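Plugging in the values quoted above, the partition boundaries work out as follows (a quick arithmetic check, written in Python for illustration):

```python
# Frame-partition arithmetic from the text.
# x, y are the frame dimensions; numx, numy are the tuning constants
# the authors chose for their ball size.
x, y = 640, 480
numx, numy = 120, 30

x1, x2 = x // 2 - numx, x // 2 + numx    # -> 200, 440
y1, y2 = y // 2 - numy, y // 2 + numy    # -> 210, 270
```
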

The various conditions for detecting the position of the ball are:

If the ball is in region 5, it is at the centre of the frame. The robot should stop.

If the ball is in region 3, it is at the left of the frame. The robot should move left.

If the ball is in region 4, it is at the right of the frame. The robot should move right.

If the ball is in region 1, it is at the upper part of the frame. The robot should move forward.

If the ball is in region 2, it is at the lower part of the frame. The robot should move backward.

These conditions are specified in the code for decision-making as given below:

(i) cbar >= x1 (output either '0' or '1')

(ii) cbar <= x2 (output either '0' or '1')

(iii) rbar <= y2 (output either '0' or '1')

(iv) rbar >= y1 (output either '0' or '1')

The above four conditions are treated as four bits, which can easily be converted into a number for convenient computation. This can be done with the following code:

e = ((cbar >= x1)*2*2*2 ... % bit number 3
    + (cbar <= x2)*2*2 ...  % bit number 2
    + (rbar >= y1)*2 ...    % bit number 1
    + (rbar <= y2));        % bit number 0

Thereafter, we can write a simple switch-case code that outputs the appropriate data through the serial port to control the external device, here a robot. The decision table derived from the above conditions is shown in Table I; 'X' denotes a "don't care" situation. Some of the cases are impossible in the physical world; the robot will stop in such situations. To determine the dimensions of the frame, use the size() function, which returns the size of the frame.
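The switch-case step can be sketched as follows (Python for illustration; the paper implements it in MATLAB). The function packs the four comparisons into the 4-bit number 'e' and maps it to a command according to Table I, using the single-letter serial codes listed in the motion-control section:

```python
def region_code(cbar, rbar, x1, x2, y1, y2):
    """Pack the four threshold comparisons into one 4-bit number e."""
    return (((cbar >= x1) << 3) | ((cbar <= x2) << 2)
            | ((rbar >= y1) << 1) | (rbar <= y2))

def command(e):
    """Map the region code to a serial command letter (per Table I)."""
    if e in (5, 6, 7):
        return 'L'   # ball left of frame
    if e in (9, 10, 11):
        return 'R'   # ball right of frame
    if e == 13:
        return 'F'   # ball in upper part of frame
    if e == 14:
        return 'B'   # ball in lower part of frame
    return 'S'       # centred, or a "don't care" case -> stop

# With the boundaries from the text, a ball centred at (240, 320) stops the robot.
x1, x2, y1, y2 = 200, 440, 210, 270
cmd = command(region_code(320, 240, x1, x2, y1, y2))   # -> 'S'
```
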

Decision-making rule based on threshold

Table I: Decision Table

cbar >= x1 | cbar <= x2 | rbar >= y1 | rbar <= y2 | Binary | Decimal (e=) | Move
     0     |     0      |     0      |     0      |  0000  |      0       |  X
     0     |     0      |     0      |     1      |  0001  |      1       |  X
     0     |     0      |     1      |     0      |  0010  |      2       |  X
     0     |     0      |     1      |     1      |  0011  |      3       |  X
     0     |     1      |     0      |     0      |  0100  |      4       |  X
     0     |     1      |     0      |     1      |  0101  |      5       |  Left
     0     |     1      |     1      |     0      |  0110  |      6       |  Left
     0     |     1      |     1      |     1      |  0111  |      7       |  Left
     1     |     0      |     0      |     0      |  1000  |      8       |  X
     1     |     0      |     0      |     1      |  1001  |      9       |  Right
     1     |     0      |     1      |     0      |  1010  |     10       |  Right
     1     |     0      |     1      |     1      |  1011  |     11       |  Right
     1     |     1      |     0      |     0      |  1100  |     12       |  X
     1     |     1      |     0      |     1      |  1101  |     13       |  Forward
     1     |     1      |     1      |     0      |  1110  |     14       |  Backward
     1     |     1      |     1      |     1      |  1111  |     15       |  Stop

Setting the RGB threshold values

The red ball is of particular interest. The steps for setting the RGB threshold values of the red ball are as follows:

1. Take at least 10 snapshots of the red ball at various angles.

2. Read each image.

3. Display the image using the 'imview' function.

4. Note down the pixel values of the red ball by placing the mouse pointer at various parts of the red ball. The threshold for the red component (R_THRESHOLD) should be the least value of the red component found in the red ball. The threshold for the green component (G_THRESHOLD) should be the maximum value of the green component found in the red ball. The threshold for the blue component (B_THRESHOLD) should be the maximum value of the blue component found in the red ball; the corresponding screenshots are shown in Figures 7, 8, 9, 10 and 11. The least value of the red component is 144, so R_THRESHOLD can be taken as 144. The maximum value of the green component is 115, so G_THRESHOLD can be taken as 115. The maximum value of the blue component is 100, so B_THRESHOLD can be taken as 100. The syntax of the functions mentioned can be found in the MATLAB Help. If the image file is not in the current directory, specify the full path of the file.
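The calibration rule in the steps above reduces to a min/max over the sampled pixel values, which can be sketched as follows (Python for illustration; the sample values here are made up for the example, chosen so the result matches the thresholds quoted in the text):

```python
# Each sample is a hypothetical (R, G, B) pixel value read off the red
# ball in one of the snapshots.
samples = [(144, 90, 60), (180, 115, 100), (200, 50, 30), (160, 80, 95)]

r_threshold = min(r for r, _, _ in samples)   # least red seen on the ball
g_threshold = max(g for _, g, _ in samples)   # most green seen on the ball
b_threshold = max(b for _, _, b in samples)   # most blue seen on the ball
# -> 144, 115, 100, matching the values quoted in the text
```
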

SYSTEM DESCRIPTION

This paper focuses on the implementation of wireless mobile robot control with object tracking based on spatial coordinates, with the images processed using MATLAB. To transfer the data to the mobile robot, the serial COM port is used, transferring the data byte by byte, so that wireless control of the robot is done from LABVIEW with the Mathscript function; the system overview is shown in Figure 3 below. Wireless control removes the constraint on the distance between the coloured object and the robot, allowing tracking from a remote location. The wireless technique uses radio-frequency waves for transmitting the control signals. The transmitter module uses the on-off keying (OOK) modulation scheme, transmitting at a frequency of 433.92 MHz.

[Block diagram: a PC running LABVIEW with Mathscript and an image grabber connects via RS232 to a microcontroller-based control program and an RF transmitter; a wireless TV receiver module brings in video from the wireless surveillance camera; on the robot side, an RF receiver feeds a microcontroller-based control program that drives the actuators.]

System overview

A PCI-based TV tuner card is used for grabbing the images from the remote location, tracking the object by colour and spatial coordinates, and monitoring the activities from a remote place through LABVIEW Mathscript, which runs the algorithm described above. Since the wireless camera is placed on the mobile robot, it is necessary to use a PCI- or USB-based TV tuner card for acquiring the remote images from the camera. The PCI card is programmed by means of MATLAB, in which a DLL file acts as a bridge between MATLAB and the PCI card for grabbing the images from the camera. These images are used for further processing.

The serial binary data is received from the computer by the MAX232 IC and transferred to the microcontroller. Depending upon the data received, the controller transmits 4-bit data to the encoder, as shown in Figure 4 below. The encoder converts the received parallel data into serial data and passes it to the TWS-434 transmitter, which transmits the data at 433.9 MHz. The RWS-434 receiver receives the transmitted data serially and feeds it to the decoder IC. The decoder converts the serial data back into parallel data, which is passed to the microcontroller. Depending upon the received data, the controller drives the motors to perform the required function. Figure 5 below shows the RF receiver interfaced with the microcontroller for mobile robot control. Transmitting through the walls of buildings will reduce the range; how much depends on the number of walls and what they are made of.

Wireless RF-433MHz interfacing circuit with PC

Wireless RF-433MHz interfacing circuit with Robot

Motion control of the robot

The computer processes the image of the red ball and sends out five different data codes through its serial COM port. The code depends on the location of the red ball — upper, lower, left, right or centre of the frame — as captured by the camera. The wireless surveillance camera is mounted at the front of the robot so that it acts as a visual sensor. To control the wireless mobile robot, serial data is transferred from the PC by running the MATLAB code on the PC, integrated with LABVIEW for serial data transfer.

The robot is controlled by sending the following codes to the serial port:

1. Key 'F' for forward motion

2. Key 'L' for left motion

3. Key 'R' for right motion

4. Key 'B' for backward motion

5. Key 'S' to stop robot motion

Any other code will make the robot stop. The code implementing motion control of the robot is given at the end of this article.

Image Acquisition using MATLAB

The Image Acquisition Toolbox 1.0 is a product from The MathWorks that enables you to connect to an image acquisition device from within a MATLAB session. Based on MATLAB object technology, the Image Acquisition Toolbox provides functions for creating objects that represent the connection to the device.

The wireless camera acquires video of the surroundings and transmits it through a wireless module, which is received by a TV receiver module. This is fed to the computer through the TV tuner PCI card, and the video can be viewed on the computer. For acquiring the real-time images from the remote location we have invoked vcap2.dll from MATLAB.

Serial Communication with LABVIEW

Some of the VISA functions on this palette are used for GPIB communication. The VISA Write and VISA Read functions work with any type of instrument communication and are the same whether you are doing GPIB or serial communication. However, because serial communication requires you to configure extra parameters, you must begin the serial port communication with the VISA Configure Serial Port VI. The VISA Configure Serial Port VI initialises the port identified by the VISA resource name to the specified settings. The timeout input sets the timeout value for the serial communication. Baud rate, data bits, parity and flow control specify those serial port parameters.

Figure 5 shows how to send the identification query command *IDN? to the instrument connected to the COM2 serial port. The VISA Configure Serial Port VI opens communication with COM2 and sets it to 9,600 baud, eight data bits, odd parity, one stop bit and XON/XOFF software handshaking. Then the VISA Write function sends the command. The VISA Read function reads back up to 200 bytes into the read buffer, and the Simple Error Handler VI checks the error status.

Serial Communication under the LABVIEW Environment

RESULTS

For controlling the wireless mobile robot based on the spatial coordinates, we need to consider the following RGB_THRESHOLD values for the real-time image acquired from the wireless surveillance camera via the PCI-based image grabber, i.e., the FRONT TECH TV tuner card. These values help the system take the exact decisions based on the rules above:

(R_THRESHOLD=) 140

(G_THRESHOLD=) 105 and

(B_THRESHOLD=) 100

These values help us obtain exact object detection for a frame of 320x240 image size. Once the object is detected, we need to find the position of the object within the frame size mentioned above. To obtain the exact position of the object, we divide the frame into four equally separated quadrants. Based on the quadrant in which the object is detected, the robot is controlled through the wireless RF modules. The various conditions for detecting the position of the ball based on the quadrants were discussed in Section IV.

The decision-making is done by implementing a simple switch-case for robot movement based on the different positions of the object in the frame, and this code is executed in MATLAB. To interface with the wireless surveillance camera we have used the VCAP2.DLL file for interfacing with MATLAB, since VCAP2.DLL is capable of accessing up to about six image-grabbing cards simultaneously.

The complete MATLAB code is executed under the LABVIEW environment by incorporating the MATLAB code into LABVIEW. LABVIEW has a toolkit known as LABVIEW Mathscript, which is used for interfacing MATLAB and LABVIEW for building more efficient applications. LABVIEW is used for serial communication to transfer the data from the PC to the mobile robot, commanding the actions based on the decision-making rule described in Section IV under Table I. The output results are shown in Figures 7, 8, 9, 10 and 11 below, which clearly show the robot movements based on the spatial coordinates obtained by executing the MATLAB code under LABVIEW Mathscript.

Robot Backward Movement

Robot Forward Movement

Robot Left Movement

Robot Right Movement

Robot Null action

LABVIEW Program for Object Tracking under LABVIEW Mathscript

The corresponding robot control data is sent from the LABVIEW environment through the serial COM port for commanding the robot's actions. The robot control commands were discussed in Section V above.

CONCLUSION

Vision-based object tracking has been a challenging research area for many researchers over several decades. In this paper our ultimate goal is to interface a low-cost PCI-based image grabber card for visual robot control from a remote location through wireless means. One of the major advantages of implementing such a system is that low-cost machine-vision applications, such as image-processing-based inspection systems, can be built with such PCI-based image grabber cards. Any low-cost image grabber card can be interfaced with LABVIEW, rather than purchasing customised frame grabber cards at greater expense. Further applications such as biometrics, human tracking and gait analysis could also be performed with such low-cost cards.

FUTURE WORK

The current tests and their results motivate further research on this project. The roadmap for further work is outlined below. Object detection could be carried out by background subtraction using appropriate thresholds, which could be used for implementing autonomous vision-based turret triggering for military applications. Two cameras could be used to apply stereo vision to assist in 3D reconstruction of the object, enabling tracking with both front and rear views. A study of the use of multiple robots in a network to detect and track multiple large objects cooperatively is planned, as is a study of the effect of the camera's frame rate and the motion parameters on the performance of the system.

Cite this page

Moving Object Tracking Mobile Robot Computer Science Essay. (2020, Jun 02). Retrieved from http://studymoose.com/moving-object-tracking-mobile-robot-computer-science-new-essay
