Bobkov A.V. - Image registration in the real time applications, page 30
A.1). The manipulator hand holds a video camera and can move it over the surface model, which contains different kinds of landscape. The control computer contains a transputer board (INMOS T805d and T425b-25 processors) with an installed frame grabber device. The scheme of the modelling complex is shown in Fig. A.1, and the common view is shown in Fig. A.2.

Fig. A.1. The scheme of the modelling complex

Fig. A.2. Common view of the modelling complex

The image of the surface model, received by the video camera, is digitised by the frame grabber and goes to the transputer board. The image can be either processed by the transputer board or passed to the control computer, depending on the experiment requirements. The image is compared with the map of the surface model, and the position of the camera over the model is determined. The correction signal is calculated and passed through the COM port to the control device in the form of a directive of the ARPS language. The control device determines the required trajectory of the manipulator and passes control signals to the manipulator motors.
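The closed cycle described above (capture, registration against the map, correction directive) can be sketched as follows. This is a minimal illustration in Python, not the original transputer code: `grab_frame`, `register_against_map` and `send_directive` are hypothetical placeholders for the frame grabber, the registration routine and the COM-port link, and the directive text is an assumption.

```python
# Minimal sketch of one iteration of the control cycle described above.
# grab_frame(), register_against_map() and send_directive() are hypothetical
# placeholders, not part of the original system.

def control_cycle(map_image, grab_frame, register_against_map, send_directive):
    """Capture a frame, register it against the map, send a correction."""
    frame = grab_frame()                              # digitised camera image
    dx, dy = register_against_map(frame, map_image)   # camera offset over the map
    # The correction is passed to the control device as an ARPS-style directive.
    send_directive(f"TEACH A {dx} {dy}")
    return dx, dy
```

In the real complex this loop runs continuously: each correction moves the camera, which changes the next captured frame.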
The camera moves to another place, and the described cycle is repeated.

Video capturing and processing system

The video capturing system includes the video camera, the frame grabber and the transputer board. The camera is a standard video camera that passes an analogue PAL-coded video image to the frame grabber. The camera has a wide visual angle and observes the whole surface model at once, so only the central part of the image, of a lesser size, is used. The camera sends images in interlaced mode, so the odd and even frame halves must be separated at the software level.

The image from the camera is passed to the INMOS TTFG-4 board. The frame grabber is a separate board based on a T805 transputer and a Bt252KPJ20 image digitising chip. It allows capturing grey-scale images with 256 grey gradations.
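The software-level separation of the odd and even frame halves mentioned above amounts to splitting the interlaced frame into its two fields. A minimal sketch in Python, assuming a frame is represented simply as a list of rows (the original system worked on raw grey-scale buffers):

```python
# Split an interlaced frame into its even and odd fields.
# A frame here is a list of rows; the original system operated on
# raw grey-scale pixel buffers instead.

def split_fields(frame):
    even_field = frame[0::2]  # rows 0, 2, 4, ... (even half-frame)
    odd_field = frame[1::2]   # rows 1, 3, 5, ... (odd half-frame)
    return even_field, odd_field
```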
The maximum frame size is 512×512 pixels; the capturing rate is 18 frames per second. The frame grabber board is inserted into the slots of the transputer board and can pass captured images over four separate links, which allows passing a video stream at a speed of 24 frames per second.

The transputer board includes four T800 transputers that can provide image processing in a sequential or parallel mode. The software developer toolkit includes program libraries, translators and a debugger for the OCCAM, ANSI C and 3L C languages.

Host computer

The host computer is a personal computer.
The transputer board is installed into the ISA slots of the host computer, and the root transputer uses one of its links to connect to it. The host computer can obtain either the result of image processing or the whole image. The host computer provides the processing operations that cannot be represented in parallel mode; these are commonly the high-level stages of image analysis. Another purpose of the host computer is to produce and send commands to the control device.

Control device

The control device "Sfera-36" is a two-level multiprocessor control device. It is used for contour-positional control of the PM-01.

The higher layer is the Advanced Robot Programming System (ARPS). ARPS includes a central computer with a specialised operating system, a user console, a remote control device, and a storage device.
The central computer is the "Electronica 11100.1" (based on the K1801 processor). It provides the user interface for the robot control, trajectory calculation and interaction between the low level and peripheral devices. The operating system is Nokia ARPS/M B05.RM-1.

ARPS provides the robot control in one of the following ways:
- by sending directives from the console;
- by running a user program;
- by positioning from the remote control device.

The control device "Sfera-36" can process commands in two forms: directives in the Nokia ARPS language from the terminal device, or required joint parameters from the hand switcher. The host computer can be connected to the terminal connector through COM ports and emulates the terminal signals. The lower layer includes microprocessor controllers for the robot motor control.

Manipulator

The manipulator PM-01 is an anthropomorphic robot with six degrees of mobility. The manipulator was modified from its original configuration by turning the sixth joint axis by 90 degrees in order to eliminate the singularity points that the original configuration had.
The joint characteristics are shown in Table 5, and the technical characteristics in Table 6.

Table 5. Joint parameters

No. of joint             1     2     3     4     5     6
Motion range, degrees    320   260   284   280   200   532
Max speed, degrees/sec   1.4   0.9   2.1   4.0   4.2   4.0
Max moment, N·m          67    113   57    14    12    14

Table 6. Technical characteristics of the manipulator

Number of joints                   6
Drive                              DC motors
Static force at the working point  Up to 60 N
Carrying capacity                  Up to 2.5 kg
Accuracy                           0.1 mm
Speed                              Up to 1 m/s
Working space                      0.92 m
Robot control                      Two-level microprocessor device
Teaching and programming devices   Console, remote control device
Programming language               Advanced Robot Programming System language
Power supply                       220 V, 50 Hz

Software

The main difficulty of the software development is the need to synchronise the work of several components (frame capturing, image analysis and robot control), where each of them has its own time characteristics, modes and hardware requirements.

The frame capturing module is executed in the transputer environment.
It must be implemented in a high-level language (OCCAM or 3L C) and compiled into an executable program of the BTL format. The loading and starting of a BTL module is performed by the loader program, which is distributed with the INMOS transputer SDK. The loader runs on the host computer under an MS-DOS type operating system, and cannot be modified or rewritten without additional documentation. On the other hand, the BTL module is executed on the transputer board and cannot get access to all the required resources of the host computer. Furthermore, debugging a BTL module is an uneasy task, since it is possible with built-in tools only. Therefore the following model of the program complex and data exchange was developed.

The frame capturing module periodically acquires a picture from the camera and stores it as a file of raw data using the host file system.
Files can be placed either on the RAM disk (for the fastest data exchange), on the hard disk drive (for further off-line analysis), or passed to a remote host using the NetBEUI protocol (e.g. if the speed of the local host is not enough and more powerful tools are required). The host computer uses a multitasking operating system and executes the loader program, the image analysis module and the robot control module in pseudo-parallel mode.

The image analysis module periodically checks the directory where images are stored in order to find the latest image from the camera.
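The "find the latest image" step amounts to scanning the exchange directory for the most recently modified raw-data file. A minimal sketch in Python; the `.raw` extension and the directory layout are assumptions for illustration, not details of the original system:

```python
import os

def latest_image(directory, ext=".raw"):
    """Return the path of the newest image file in the directory, or None.

    The ".raw" extension is an assumed convention for the raw-data files
    written by the frame capturing module.
    """
    candidates = [
        os.path.join(directory, name)
        for name in os.listdir(directory)
        if name.endswith(ext)
    ]
    if not candidates:
        return None
    # The newest file is the one the capturing module wrote last.
    return max(candidates, key=os.path.getmtime)
```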
Then the image analysis is performed according to the methods presented in the current research, and the translation value of the current frame in relation to the sample is calculated. The sample can be either the first frame marked as a map (map travelling mode) or the previous frame (frame-by-frame travelling mode).

Using the translation value, the robot control module determines the required coordinates A <X, Y, Z, ϕX, ϕY, ϕZ> of the manipulator, and sends the command TEACH A through the COM port to the robot control device. This command determines the new position of the point A, which the robot must reach. Reading the reply from the COM port and accepting the data is performed in a separate thread. The robot control device runs a simple cyclic program that makes the robot follow to the point A:

1. GO A
2. GOTO 1

When the TEACH A directive is passed, the position of point A is changed, and the robot begins moving to the new position.
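The TEACH A command can be illustrated by a small helper that formats the six coordinates into a directive string. This is a sketch: the exact textual syntax of the ARPS directive is an assumption, and the real exchange additionally handles the COM-port reply in a separate thread.

```python
def teach_directive(x, y, z, phi_x, phi_y, phi_z):
    """Format a TEACH A directive for the point A <X, Y, Z, phiX, phiY, phiZ>.

    The textual layout of the directive is assumed for illustration; the
    original system sent it over the COM port to the "Sfera-36" device.
    """
    coords = (x, y, z, phi_x, phi_y, phi_z)
    return "TEACH A " + " ".join(f"{c:.1f}" for c in coords)
```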
As a result, the image obtained by the camera changes, and the process is repeated.

Appendix B. Software for the off-line image analysis

This mode uses a sequence of images either stored on disk while the robot moves along a given finite trajectory, or a sequence of artificial images.

RoboVision module

This module was developed in co-operation with Anton Medvedev, Bauman MSTU. The aim of the module is the research of edge detection filters and basic filtering operations. The module contains basic tools of image filtration, different edge detectors (both known and newly developed for the research purposes), line and angle detectors, and statistics collection.
This module was used to obtain a significant part of the experimental results of Chapter 2.

Fig. B.1. Common view of the RoboVision module interface

WinLoader module

This module was used for the research of line detection algorithms and the development of image registration methods. It allows researching all stages of the line extraction and investigating different methods of image registration. The module contains tools for measuring the duration of all stages, and has the ability to collect and store statistical data about image registration for further analysis by external tools. This module was used to obtain the experimental results of Chapters 3 and 4.

Fig. B.2. Common view of the WinLoader module interface

StatCheck module

This module was developed for the analysis of the statistics collected by the WinLoader module, in order to research the accuracy and reliability of different image registration approaches under different conditions. The StatCheck module is oriented to the processing of large data volumes, which cannot be effectively processed by traditional tools. This module was used to get the experimental data on the accuracy and reliability of different image registration approaches at the end of Chapter 4.

Fig. B.3. Common view of the StatCheck module interface