Open access peer-reviewed chapter

Artificial Vision for Manufacturing System

Written By

Mejia Ugalde Mario and Mejia Ugalde Ignacio

Submitted: March 30th, 2016 Reviewed: July 18th, 2016 Published: November 23rd, 2016

DOI: 10.5772/64938


Abstract

Artificial vision is an important component of automated manufacturing processes aimed at improving productivity and quality. This investigation presents a new method that applies artificial intelligence to tool selection: a camera takes a picture of the part, and an image-processing program analyzes the image to derive the size of the cutting tool. This chapter describes the development of this method, which combines programming and artificial vision to select the cutting tool. The experimental results show that this combination is capable of selecting the correct tool.

Keywords

  • artificial vision
  • cutting tool
  • artificial intelligence
  • image
  • manufacture

1. Introduction

In recent years, artificial vision has been impacting the manufacturing process due to its mass adoption in different lines of research such as robotics [1, 2], machining [3, 4], and automotive engineering [5], among others. However, numerous variables affect the machining process, for example, tool materials, total error compensation, concepts of autonomous manufacturing, and process condition monitoring. Computer vision has also been applied to cutting-condition effects such as roughness (presented by Sarma et al. [6]), to the prediction of cutting conditions (by Gadelmawla et al. [7]), and to path generation for milling on Computerized Numerical Control (CNC) machines (by Eladawi et al. [3]).

At present, no commercial computer-aided manufacturing (CAM) system is capable of recognizing complex features for tool selection; the task is transferred to workers, who perform it based on their own experience, ability, and knowledge. The literature on automatic tool selection using image processing is minimal; simple and easy-to-implement algorithms are the core of this method [8].


2. Artificial vision in manufacture

This research presents a new technique based on artificial vision using image processing for automatic tool selection in operations of milling‐turning manufacture.

The proposed method starts by recognizing an image taken by a photographic camera. Then, different image-processing techniques are applied: binarization to convert the grayscale image to black and white; segmentation to reduce or enlarge the contour as necessary; directional morphology to determine the magnitude, position, and direction of the edge of the part; and finally, a structural element with the shape of the tool is moved along the edge to generate the trajectory and dimensions of the cutting tool [9].
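As a rough sketch of the first two stages (binarization and contour extraction), the following Python/NumPy fragment illustrates the idea. It is a minimal illustration, not the authors' implementation; the threshold value, the 4-neighbour contour test, and all function names are assumptions:

```python
import numpy as np

def binarize(gray, threshold=128):
    """Threshold a grayscale image (values 0-255) into a binary mask:
    1 for pixels brighter than the threshold, 0 otherwise."""
    return (gray > threshold).astype(np.uint8)

def edge_pixels(binary):
    """Mark piece pixels that touch the background: a simple contour
    extraction by comparing each pixel with its four neighbours."""
    padded = np.pad(binary, 1, constant_values=0)
    core = padded[1:-1, 1:-1]
    # a pixel keeps value 1 in `neighbours` only if ALL four neighbours are 1
    neighbours = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                  padded[1:-1, :-2] & padded[1:-1, 2:])
    contour = (core == 1) & (neighbours == 0)
    return contour.astype(np.uint8)

# Tiny demo: a 2x2 bright square inside a dark 4x4 image.
gray = np.array([[0,   0,   0,   0],
                 [0, 200, 200,   0],
                 [0, 200, 200,   0],
                 [0,   0,   0,   0]], dtype=np.uint8)
mask = binarize(gray)
contour = edge_pixels(mask)
```

Here every piece pixel touches the background, so the whole 2x2 square is contour.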


3. Artificial vision in lathe

This method consists of nine steps for machining 2D objects, mainly on two-axis lathe machines: in step 1, an image designed in CAD or taken from a camera is extracted and saved to a DXF file; step 2 consists of transforming the DXF file to a BMP file (image); in step 3, binarization is applied to convert the grayscale image to black and white; step 4 consists of labeling to separate the object from the picture; in step 5, the perimeter of the piece is obtained; in step 6, a partial derivative is applied to obtain the gradient at the edge of the piece; in step 7, a structural element with the shape of the tool is moved across the image; and finally, in steps 8 and 9, a condition is applied to determine whether there is an intersection between the selected tool and the object, generating the tool selection and the trajectory [9].

The method for automatic tool selection consists of identifying collisions as a structural element with the shape of the tool travels over each pixel on the edge of the part in the image. If there is an intersection between the cutting tool and the workpiece, the diameter of the cutting tool is automatically changed until there is no intersection. These steps are shown in Figure 1 and detailed below [9]:

Step 1. Generation of DXF file starting with a picture

The DXF file contains the information of the piece inside the image (Figure 2b), while the CAD file is generated in 3D but exported in 2D (Figure 2a) [9].

Step 2. File transformations (DXF to BMP)

When the different transformations are applied, it is necessary to know the dimensions of the piece and to keep the physical part (length in mm) and the image (length in pixels) as close as possible in precision; the analysis applied is shown in Table 1 [9].

Figure 1.

General diagram of the automatic cutting tool selection.

Figure 2.

Principal file, (a) design from CAD and (b) DXF file [9].

Resolution        Scale   mm   Length (pixels)   Height (pixels)
800 × 600         4:3     1    15                11
3200 × 2048       25:16   1    62                40
5120 × 4096       5:4     1    100               80
7680 × 4800       16:10   1    151               94
10,000 × 10,000   1:1     1    196               196
50,800 × 50,800   1:1     1    1000              1000

Table 1.

The resolution of this system.

A method of data extraction [6] is applied to convert the DXF file to a BMP file, as shown in Figure 3(a) and (b). A known distance on the part is then automatically measured in pixels in the image, and a calibration value is obtained by dividing the distance in millimeters by the distance in pixels, giving a relation of mm per pixel. This calibration value is also the resolution of the system and is approximately 1 μm:

Relation = distance(mm) / distance(pixels)     (E1)
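As a numeric illustration of Eq. (1), here is a hypothetical measurement consistent with the last row of Table 1 (1000 pixels per millimeter):

```python
# Hypothetical example: a 50.8 mm reference feature spanning 50,800 pixels,
# as in the 50,800 x 50,800 row of Table 1 (1000 pixels per mm).
distance_mm = 50.8
distance_pixels = 50800
relation = distance_mm / distance_pixels      # mm per pixel (Eq. 1)
resolution_um = relation * 1000.0             # the same calibration value in micrometres
```

With these numbers the relation is 0.001 mm per pixel, i.e. the 1 μm resolution stated in the text.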

Figure 3.

Files, (a) design without texture (Wire file) and (b) image (BMP file) [9].

All the dimensions are known in the DXF file. In order to find the directional gradient of the edge, image processing is applied.

Step 3. Grayscale to binary image

The workpiece λ is a subset of the image f, represented as a matrix of m pixels stored in an array f in the following way:

λ_τ ∈ f,   ⋃_{τ=1}^{m} (λ_τ) = f     (E2)

where m is the total number of pixels in the image and τ is the region of the image. Starting with the image shown in Figure 3(b), a transformation (thresholding) is applied to convert the grayscale image to binary using Eq. (3):

for i = 1..n, k = 1..m:   if f(i,k) > 128 then f_B(i,k) = 1, else f_B(i,k) = 0     (E3)

Each pixel has a value between 0 (black, the minimum tonality) and 255 (white, the maximum). Since the method cannot work with this whole range of tonalities (a grayscale image), only two values are needed (a binary image): one for the piece (black) and the other for the rest of the image (white) [9].

Step 4. Labeling of the workpiece in the image

Figure 4(a) shows the labeling of the image and the piece (Eq. (4)). Let f_B be a black-and-white image in the binary space, where

e = ∀k { f_k ∈ f_B | if f_k = 0 then x_k = 0; if f_k = 1 then x_k = 1 }     (E4)

Figure 4.

Labeling definition, (a) labeling of part and (b) labeling of perimeter [9].

Or it can also be written as Eq. (5):

for k = m/2..m, i = 0..n:   e(i,k) = [ if f(i,k) = 1 then f_e(i,k) = 1 ] ∧ [ if f(i,k) = 0 then f_e(i,k) = 0 ]     (E5)

where f_e(i,k) is the labeling image saved in the output image e(i,k) of Eq. (5), taking half of the image, k = m/2, on the Z axis. The directional classification of the edge for half of the image at the point p(x,z) is presented in Figure 3(b).

Step 5. Labeling edge and perimeter

The perimeter P̂(θ) can be obtained from the edge (|∇f|) of the image by Eq. (6):

P̂(θ) = 2 ∫_{z0}^{z1} √(1 + q̂′²) dz = ∫_0^{2π} √(r̂_o² + r̂′_o²) dθ     (E6)

The vector q̂′ represents the partial derivative of the piece in the image f(i,k) with respect to a frame of reference (r̂_o and r̂′_o are the radii) for each pixel p(i,k); see Figure 4(b).

Another way to calculate the integral is as the arithmetic sum over each pixel of the image, represented with the symbol Σ, for a frame of reference (from z0 to z1), with the aim of finding the perimeter S_p [10] that represents the labeled edge, as shown in Eqs. (7) and (8) [9]:

for N = 0, 1, …:   if E_N(i,k) then p(z, P(z)), with { N = N + 1, z = N }     (E7)
S_p = Σ_{k=m/2}^{m} Σ_{i=1}^{n−1} { for N = 1..m/2: if E_N(i,k) = 1 then f(i,k) ⇐ E_N(i,k), else f(i,k) ⇐ f(i,k) }     (E8)

Figure 4(b) shows the labeling of the perimeter S_p on the edge of the piece for each point p(i,k).
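The perimeter labeling can be sketched as follows: a piece pixel is labeled as perimeter when at least one of its eight neighbours is background. This is a minimal Python/NumPy illustration, not the authors' code, and the function name is an assumption:

```python
import numpy as np

def label_perimeter(binary):
    """Label edge pixels of the piece: a piece pixel belongs to the
    perimeter S_p if any of its 8 neighbours is background (0)."""
    padded = np.pad(binary, 1, constant_values=0)
    out = np.zeros_like(binary)
    n, m = binary.shape
    for i in range(n):
        for j in range(m):
            if binary[i, j] == 1:
                # 3x3 window centred on (i, j) in the padded image
                window = padded[i:i + 3, j:j + 3]
                if window.sum() < 9:   # at least one background neighbour
                    out[i, j] = 1
    return out

# Demo: a 3x3 solid block; only its centre pixel is interior.
piece = np.zeros((5, 5), dtype=np.uint8)
piece[1:4, 1:4] = 1
sp = label_perimeter(piece)
```

For the 3x3 block, the eight border pixels are labeled and the centre is not.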

Step 6. Using partial derivatives to obtain the gradient

There are several methods to obtain the gradient (the variation in intensity between pixels) (Eq. (9)); among them, the Sobel method is selected for its computational ease, as shown in Eqs. (10) and (11) [11]. The gradient generates the directional vectors, the orientation of the piece, and the direction of the contour of the workpiece:

∇f = gradient f = { G_x = ∂f(x,z)/∂x û_1,   G_z = ∂f(x,z)/∂z û_2 }     (E9)

Therefore,

G_x = Σ_{i=1}^{n−1} Σ_{k=1}^{m/2} { Edge_x(i,k) = Σ_{z=−1}^{1} Σ_{x=−1}^{1} e(x,z) · M_x(z+2, x+2) }     (E10)
G_z = Σ_{i=1}^{n−1} Σ_{k=1}^{m/2} { Edge_z(i,k) = Σ_{z=−1}^{1} Σ_{x=−1}^{1} e(x,z) · M_z(z+2, x+2) }     (E11)

Here, M_x and M_z are the transformation matrices (Sobel masks) applied to the image to generate the edge. The gradient vector represents the maximum change of intensity at the point p(i,k).

The magnitude (|∇f|) and direction (∠∇f) are given by Eqs. (12) and (13), respectively:

|∇f| = |G_x| + |G_z|     (E12)
∠∇f = tan⁻¹(G_z / G_x)     (E13)
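A sketch of the Sobel step of Eqs. (10)–(13) in Python/NumPy follows. The masks are the standard Sobel matrices; the edge padding, the L1 magnitude, and the function name are assumptions of this illustration rather than the authors' implementation:

```python
import numpy as np

# Standard Sobel masks (the M_x and M_z matrices of Eqs. (10) and (11))
MX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
MZ = MX.T

def sobel(image):
    """Filter the image with both Sobel masks and return the L1 magnitude
    (Eq. 12) and direction in degrees (Eq. 13) at every pixel."""
    padded = np.pad(image.astype(float), 1, mode='edge')
    n, m = image.shape
    gx = np.zeros((n, m))
    gz = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            window = padded[i:i + 3, j:j + 3]   # 3x3 neighbourhood of (i, j)
            gx[i, j] = (window * MX).sum()
            gz[i, j] = (window * MZ).sum()
    magnitude = np.abs(gx) + np.abs(gz)          # |grad f| ~ |Gx| + |Gz|
    direction = np.degrees(np.arctan2(gz, gx))   # edge direction angle
    return magnitude, direction

# Demo: a vertical step edge between dark and bright columns.
img = np.zeros((4, 4))
img[:, 2:] = 1.0
mag, ang = sobel(img)
```

On the step edge the horizontal response dominates, so the direction there is 0°.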

Figure 5(a) illustrates the vector direction of the edge of the part sample.

Figure 5.

Vectors of part, (a) direction vector of part and (b) magnitude of edge direction [9].

Figure 5(b) shows the directional angles according to the trajectory of the edge, when the gradient is applied to binary image in the part.

The magnitude and direction of the edge are saved in the variable Maq(i,k) using Eq. (14) to generate the dimension of structural element:

for k = m..m/2−1, i = 1..n−1:   if an edge exists then Maq(i,k) ⇐ { |∇f|(i,k) (magnitude), ∠∇f(i,k) (direction) }     (E14)

Step 7. Automatic tool selections

The workspace is considered according to the size of the piece, but it is also necessary to consider other collisions, such as the turret with the part; these will be considered in future work [10]. Table 2 shows the common cutting tools; their dimensions are obtained from manuals and introduced into the software to be converted into structural elements with the length, orientation, and nose radius of the insert, as shown in Figure 6(a). Among the common dimensions are angles of 15°, 35°, and 45°, with a 9.525 mm length requiring 36 pixels, a 25.40 mm length requiring a mesh of 25.5 square units of pixels, and the structural element requiring 414 pixels; see Figure 6(a) [9].

Table 2.

Common inserts (structural element).

Figure 6.

(a) Structural element for right insert, (b) incorrect insert and (c) correct insert [9].

There are several conditions for machining; among them, it is necessary to determine whether the cut is internal or external, and whether the insert is right-handed (180° ≤ ∠∇f < 270°) or left-handed (270° ≤ ∠∇f < 360°). The most common inserts are those situated in the last two quadrants; the first two quadrants are generally used for internal machining. Eq. (15) generates the movements of the structural element through the image; the angle marks the start of the selection of the cutting tool in machining:

for k = m..m/2−1, i = 1..n−1:   if Maq(i,k) = 1 then select insert { μ(i,k) ⇐ ∠∇f (angle of edge) }     (E15)

where μ is the dimension for cutting tool presented in Eq. (16):

for k = m..m/2−1, i = 1..n−1:   if Maq(i,k) = 1 then μ(i,k) is { left insert if 17.5° ≤ μ(i,k) < 45°;  right insert if 107° ≤ μ(i,k) < 180°;  neutral insert if μ(i,k) = 90°;  knife if μ(i,k) = 90° (up and down) }     (E16)
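The angle ranges of Eq. (16) can be expressed directly as a small classification function. This is an illustrative sketch; the fallback label for angles outside the listed ranges, and the handling of the knife case, are assumptions:

```python
def select_insert(mu):
    """Classify the edge angle mu (degrees) into an insert type,
    following the ranges of Eq. (16)."""
    if 17.5 <= mu < 45:
        return "left insert"
    if 107 <= mu < 180:
        return "right insert"
    if mu == 90:
        # Eq. (16) also lists a knife insert for the 90-degree up/down case;
        # this sketch returns the neutral insert for simplicity.
        return "neutral insert"
    return "no standard insert"   # assumption: angles outside the ranges
```

For example, an edge angle of 30° selects a left insert and 120° a right insert.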

Table 2 shows the common inserts that are generated as structural elements. Figure 6(a) illustrates a structural element generated in pixels to be moved along the contour so that the contour matches the nose radius of the tool. At the resolution used (800 × 600), a line is represented by five pixels; the evaluation is taken from reference [9]. The pixels representing the line in Figure 6(a) are used only to demonstrate the method; the pixels actually processed are the inner ones, using the centroid method.

When a structural element passes along the edge of the workpiece, it is necessary to know whether the insert is correct, that is, free of intersection, or whether another insert must be tried. The structural element (Figure 6b) is displaced through the whole image; when it collides (Figure 6c), another structural element is chosen automatically, a process carried out within the software itself. The technique for automatic tool selection may change the radius, length, or angle of the insert according to the inserts obtained from handbooks [9].
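The collision-and-reselection loop can be sketched as follows. The disk-shaped tool, the candidate radii, the test geometry, and all names are hypothetical; this illustrates the principle of shrinking the tool until it clears the part, not the authors' implementation:

```python
import numpy as np

def tool_mask(radius):
    """Disk-shaped structural element representing the tool nose, in pixels."""
    r = int(radius)
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return (x * x + y * y <= r * r).astype(np.uint8)

def intersects(piece, tool, cy, cx):
    """True if the tool centred at pixel (cy, cx) overlaps the piece."""
    r = tool.shape[0] // 2
    window = piece[cy - r:cy + r + 1, cx - r:cx + r + 1]
    if window.shape != tool.shape:      # tool sticks out of the image
        return True
    return bool((window & tool).any())

def select_tool(piece, centers, radii):
    """Return the largest candidate radius whose tool clears every
    candidate centre position; None if no candidate clears."""
    for r in sorted(radii, reverse=True):
        tool = tool_mask(r)
        if all(not intersects(piece, tool, y, x) for y, x in centers):
            return r
    return None

# Demo: material occupies the lower half; tool centres run 3 rows above it,
# so radius 3 collides but radius 2 clears.
piece = np.zeros((20, 20), dtype=np.uint8)
piece[8:, :] = 1
centers = [(5, x) for x in range(5, 15)]
chosen = select_tool(piece, centers, [1, 2, 3])
```

In this hypothetical layout the method keeps the largest non-colliding radius, 2.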

After the correct holders and inserts have been selected for each area, many cutting-tool changes will probably be required. The number of changes can be reduced by ordering them with the following procedure (Eq. (17)):

for i = 1..n−1, k = m/2..m:   if S_p(i,k) = 1 and Σ_{h=−1}^{1} Σ_{ψ=−1}^{1} λ[i+ψ][k+1] = 1, then Tray(i,k) ⇐ λ[i+ψ][k+1]     (E17)

To determine the tool path, each pixel should share information with its eight neighbors and follow the path of the perimeter (edge S_p) in the mesh λ[i+ψ][k+1].

Step 8. Generation of trajectory

Figure 7(a) illustrates the one-way zig method, but there are other methods of creating paths: zigzag, with two paths (back and forth); zig with contour, which follows only the path of the edge of the piece and is widely used for finishing and roughing at the same time; follow-periphery, which is used only for finishing; and the trochoidal profile, used in special cases when the piece has peaks, elevations, or very long inclinations. The zig trajectory, on the other hand, is used only for rough cutting. To generate the trajectories, a structural element with the shape of the cutting tool, with angle μ, length ς, and nose angle β, is displaced through the whole image from right to left.
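A minimal sketch of a one-way zig roughing strategy, under the simplifying assumption that the finished profile is given as one final height per column; all names, units, and the pass ordering are hypothetical:

```python
def zig_path(edge_heights, step_down, z_top):
    """Generate one-way ('zig') roughing passes: at each depth the tool
    moves right-to-left and cuts only the columns whose final profile
    height lies at or below that pass. edge_heights[x] is the final Z of
    the part at column x (hypothetical units)."""
    passes = []
    z = z_top - step_down
    z_min = min(edge_heights)
    while z >= z_min:
        # right-to-left list of columns that may be cut at this depth
        cut = [x for x in range(len(edge_heights) - 1, -1, -1)
               if edge_heights[x] <= z]
        if cut:
            passes.append((z, cut))
        z -= step_down
    return passes

# Demo: a groove in the middle of a bar (final heights per column).
passes = zig_path([3, 1, 1, 3], step_down=1, z_top=3)
```

Each pass is a (depth, columns) pair; here two passes clear the central groove.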

Figure 7.

Tool selection (final piece), (a) rough turn (zig) and (b) finish turn (zig with contour) [9].

Finally, to move the tool in the image, it is necessary to use a counter (k = 0, i = i + 1) with the total number of pixels m. This method is obtained by Eq. (18) and is displayed in Figure 7(a) [9]:

for k = m..m/2−1, i = 1..n−1:   Move_Tray(i,k) = f(i,k); { if f(i,k) = Tray(i,k) = 1 then k = k + 1 }     (E18)

Machining time is given by Eq. (19):

T = Σ_{i=1}^{n} (L_i / v_i)     (E19)
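A numeric illustration of Eq. (19), with hypothetical pass lengths and feed rates:

```python
# Eq. (19): total machining time is the sum of each pass length divided
# by its feed rate. The lengths (mm) and feeds (mm/min) are hypothetical.
lengths = [120.0, 80.0, 40.0]
feeds = [200.0, 200.0, 100.0]
T = sum(L / v for L, v in zip(lengths, feeds))   # total time in minutes
```

With these values the three passes take 0.6, 0.4, and 0.4 minutes, so T = 1.4 min.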

4. Results in lathe

With the proposed method, an image can be taken directly by a camera or another device for automatic tool selection. An image generated from a DXF file is used to compare the structural elements (part and tool) in pixels; when the design comes from a CAD file, the original dimensions from the DXF file are used for the comparison. This new method, which avoids complex and difficult-to-understand mathematical algorithms, offers an easily simulated view for selecting the cutting tool. To demonstrate the method, two pieces were used; the results are presented in Table 2. The generation of trajectories is shown in Figure 7(a) and (b), and the tool selection for the first piece in Table 3.

Rough turn          Finish turn
Area   Type         Line   Type
1      L            1      L
2      L            2      R
3      L            3      L
4      R            4      N
5      N            5      R
6      –            6      K

Table 3.

Selected inserts for part of Figure 7.

Figure 8.

Tool selection, (a) final piece and (b) thread and rough turn (zig) [9].

Table 3 shows the results of the selected inserts after applying the proposed method, where the numbers represent the area for each insert and L, R, N, and K represent the most common insert types (left, right, neutral, and knife), selected relative to their size.

The second example is presented in Figure 8. The selected cutting tools are presented in Table 4, where an existing thread machining is represented by the number four and an internal machining is labeled number five.

Rough turn          Finish turn
Area   Type         Line   Type
1      L            1      L
2      L            2      –
3      R            3      R
4      M12 × 1.25   4      –
5      D/L          5      D/L

Table 4.

Cutting tool selection for second example.

Figure 9.

Final software for tool selection [9].

The developed software is shown in Figure 9. It was developed in Microsoft Visual C++ 2010 and tested on 2D images generated in any CAD software and exported to DXF files.


5. Conclusions

In this work, a new method for automatic tool selection using image-processing techniques for computer numerical control lathe machines has been presented. The proposed methodology has been applied to images generated by CAD software and exported to DXF files. The resolution of this system is approximately 1 μm.

From this, it is clear that this methodology can be implemented in commercial CAM software in order to systematize the CNC lathe process. The novelty of this work is the use of image processing to automatically select the cutting tool; the design of the part can come from a photograph taken by a camera or another device, or directly from a CAD model.


6. Artificial vision in mill

A 2.5D solid model can be defined as one that can be cut with a series of 2D tool paths at different Z slices of a 3D solid model. Nowadays, a large percentage of the pieces used in industry have the shape of contours whose base face is a plane, denominated 2.5-axis parts. Automatic tool selection in milling operations is one of the important steps of process planning. Moreover, at present, commercial computer-aided manufacturing software transfers this task to the worker, who performs it based on their own experience, ability, and knowledge. Numerous articles have reported on the variables that affect the milling process, such as material selection [7, 12], tool selection [7], cutting conditions [7], tool materials [12], tool sequence [13], cutting fluids [12], tool path [6], and control and identification, among others.

Tool selection, a task commonly performed by a human operator, is an important aspect of machining processes, since an incorrectly selected tool can produce dimensional errors in the workpiece and possible crashes and, consequently, cause the piece to be rejected. Some studies have addressed tool selection for machining 2.5D parts, as in the case of Ahmad et al. [13], who present an optimization algorithm for the tool-sequence problem [8], or the method proposed by Lim et al. [14], who used experimental algorithms with Boolean mathematics to determine the optimal set of tools for pockets with the integration of CAD/CAM.

In the same way, Hemant et al. [15] developed an algorithm for tool selection based on the Veeramani approach; although it uses dynamic programming, human intervention is necessary because of its mathematical complexity [4]. Tool selection using shape-based image processing is a new method in the literature; the core of this research is simple and easy-to-implement algorithms [8].

Image processing also plays an important role when images are manipulated with morphology (the opening operation); the main research lines are defect detection by Tunák et al. [16], cutting conditions by Sarma et al. [7] and Gadelmawla et al. [3], and path generation by Eladawi et al. [6, 8].

The contribution of this research is an easy way to simulate the cutting path, automatic tool selection, an easy method for 2.5-axis milling operations, a reduction of errors, and no requirement of prior knowledge. The novelty of this work is the use of morphology based on image processing applied to automatic tool selection. The advantages of this method are that it is easy to implement and eliminates the decisions and errors made when the tool is selected manually [8].


7. Introduction

The application of directional morphology to the tool and the workpiece with the sole aim of selecting the cutters for 2.5-axis machining is a new proposal in the bibliography. The method begins by taking the mesh of a 3D object exported to a DXF file (Mejia et al. [8]) or an image taken by a camera. To manipulate this file, it is necessary to extract the coordinates, lengths, and positions of the pixels generated on the frontier of the piece on the surface. The pixels on the edge of the piece are discretized to obtain normal vectors, which are stored in a text file. The file data gives the necessary information about the orientation of the tool when it passes through each place. The dimensions of the tools, extracted from the handbooks most common in the industry, are stored and discretized. The resolution of the system depends on the size of the tool and the complexity of the piece; to improve the algorithm, it is necessary to reduce the size of the pixels. The pixels of the tool are then displaced along the frontier of the piece, and each movement of the tool is inspected for collisions.

The image-processing techniques used are: (a) morphological operations: erosion and dilation [17, 18], which reduce or increase the contour of the workpiece; (b) binary image: pixels with grayscale values in [0, 255] are converted to black and white {0, 1}; (c) direction vectors of the piece: the Sobel model is the common operator because it has better performance and is easy to use [19]; (d) extracting the piece: the common method is labeling (scan mask) to register images; (e) CAD-CAM-CAE software: the systems that graph, design, and simulate, respectively; (f) 2.5D models: the images in the present article are 2.5D models, built from a single 2D image with a manually set Z axis; and (g) structural elements: the cutting tool is represented in pixels, because a greater number of pixels to represent the cutting tool improves accuracy [8].

The work published by Bithika and Asit [18], which applies mathematical morphology to detect manufacturing defects, is one example of the application of image processing; using such techniques to measure the effect of cutting speed on surface roughness, as presented by Sarma et al. [7], is another example that provides a new strategy in manufacturing.


8. Methodology

This method consists of several steps for machining 2D and 2.5D objects, mainly on 2.5-axis milling machines: an image designed in CAD or taken from a camera is extracted and saved to a DXF file; the DXF file is transformed to a BMP file (image); binarization is applied to convert the grayscale image to black and white; labeling separates the object from the picture; the perimeter of the piece is obtained; a partial derivative is applied to obtain the gradient at the edge of the piece; a structural element with the shape of the tool is moved across the image; and finally, a condition using mathematical morphology, especially erosion and dilation, determines whether there is an intersection, generating the automatic tool selection and the tool path, as shown in Figure 10 [8].

Figure 10.

General diagram of the automatic cutting tool selection [8].

Step 1. Generation of DXF file starting with a picture

The DXF file (Figure 11b) contains the information of the piece inside the image (Figure 11a); the CAD file is generated in 3D but exported in 2D.

Figure 11.

Original files, (a) design from CAD or camera and (b) DXF file [8].

When the different transformations are applied (DXF to BMP), it is necessary to know the dimensions of the piece and to keep the physical part (length in mm) and the image (length in pixels) as close as possible in precision. From the analysis applied, the calibration value of this system is on the order of micrometers; the adjustment parameters for 1 mm range from 800 × 600 (scale 4:3, 15 × 11 pixels) to 50,800 × 50,800 (scale 1:1, 1000 × 1000 pixels) [8].

The DXF file contains the necessary information about the part when it is converted to a BMP file, as shown in Figure 12(a) and (b). The calibration is the relation between the distance in millimeters (distance_mm) and the distance in pixels, expressed in millimeters per pixel (Relation (mm per pixel)).

Figure 12.

Files, (a) design without texture (Wire file) and (b) image (BMP file) [8].

Step 2. Image preprocessing

Binary image: a binary image f_B in 2.5D is a subset χ_D of ℝ³; {0,1}^ℝ³ denotes the set of functions from ℝ³ to {0,1}, so that any binary image can be represented by a characteristic function χ_D: ℝ³ → {0,1}. Starting with the image, a transformation (thresholding) is applied to convert the grayscale image to binary:

for i = 1..n, j = 1..m:   if f(i,j) > T then A1 ⇐ f(i,j) (object pixels), else A2 ⇐ f(i,j) (background pixels), giving f_B(i,j)     (E20)

where the threshold T is equal to

T = (average value of A1 + average value of A2) / 2     (E21)

where f(i,j) is the grayscale image (an image matrix of n × m pixels) and f_B(i,j)|n×m is the result of the transformation of grayscale into a binary image.
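The threshold rule of Eqs. (20) and (21) can be sketched as an iterative two-class mean: split the pixels at the current threshold, average each class, and move the threshold to the midpoint of the two averages. This is an interpretation of the equations, not the authors' code:

```python
import numpy as np

def mean_threshold(gray, iterations=10):
    """Split pixels into object (A1) and background (A2) classes and set
    the threshold to the mean of the two class averages, repeating until
    the value stabilises (the rule of Eqs. (20) and (21))."""
    t = gray.mean()
    for _ in range(iterations):
        a1 = gray[gray > t]    # object pixels
        a2 = gray[gray <= t]   # background pixels
        if a1.size == 0 or a2.size == 0:
            break
        new_t = (a1.mean() + a2.mean()) / 2.0
        if abs(new_t - t) < 0.5:
            break
        t = new_t
    return t

# Demo: two clearly separated intensity clusters.
gray = np.array([10, 12, 14, 200, 210, 220], dtype=float)
t = mean_threshold(gray)
binary = (gray > t).astype(int)
```

The threshold settles between the two clusters, separating piece from background.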

Step 3. Image processing

Detection and labeling of the gradient and perimeter at the edge of the piece: the following step is the binary labeling, or delimitation, of the piece; see Figure 13(a) (Eq. (22)). Let f_B be a black-and-white image in the binary space:

∀ f_B: for j = 1..m, i = 1..n:   e(i,j) = { f(i,j) = 1 ⇒ f_e(i,j) = 1;  f(i,j) = 0 ⇒ f_e(i,j) = 0 }     (E22)

Figure 13.

Labeling definition, (a) labeling of piece and (b) labeling of perimeter [8].

where f_e(i,j) is the labeling image saved in the output image e(i,j)|n×m of Eq. (22). The directional classification of the edge for half of the image at the point p(x,y) is presented in Figure 13(b).

Figure 14.

Vectors of piece, (a) direction vector of piece and (b) magnitude of edge direction [8].

Perimeter of the piece: The perimeter (P^(θ)) can be found by applying an edge (|f|) into the image (f(i,j)|nm) to provide information about the shape of the object, and the labeled perimeter Sp can be defined as the position vector for the edge. Figure 14(b) presents the result of the trajectory of the labeled perimeter Sp represented with label "1" in the point p(i,j) of the image.

Figure 14(a) and (b) show the direction vectors in the contour of the workpiece when image processing is applied.

The magnitude (|∇f|(i,j)) and direction (∠∇f(i,j)) of the labeled gradient and perimeter of the edge are saved in the variable Maq(i,j) to generate the dimension of the structural element:

{ S_p(i,j) ∈ ∇f(x,y) : ( |∇f|(i,j), ∠∇f(i,j) ) ⇒ Maq(i,j) }     (E23)

Therefore,

Maq(i,j) = ∫∫ (|∇f|(i,j)) (∠∇f(i,j)) dx dy, for j = 1..m−1, i = 1..n−1     (E24)

Step 4. Directional morphology

Directional morphology is used to generate the size of the tool; erosion and dilation are fundamental operations in morphological image processing, defined for binary images. For grayscale images, and for complete lattices in general, dilation (δ) and erosion (ε) are formed by a structural element.

Figure 15.

Directional morphology, (a) tool size dilatation and (b) tool path dilatation [8].

Dilation allows modifying the contour of the part in the image f(i,j) using a structural element b(i,j), as developed in Eq. (25); see Figure 15:

(A ⊕ B)(w,q) = max{ f(w−i, q−j) + b(i,j) }     (E25)

such that (w−i, q−j) ∈ D_A and (i,j) ∈ D_B, where D_A and D_B are the domains of A and B [4].

Eqs. (26) and (27) represent fundamental definitions of the dilation A ⊕ B, where A contains all the pixels, B is a structural element of the cutting tool, B̂ is the reflection of B, and s is a translation:

A ⊕ B = { s | (B̂)_s ∩ A ≠ ∅ }     (E26)

Equivalently,

A ⊕ B = { s | ((B̂)_s ∩ A) ⊆ A }     (E27)

Dilation is the union of the translations of the image A by each pixel b_i of the structural element B:

A ⊕ B = ⋃_{b_i ∈ B} A_{b_i}     (E28)

Therefore

for i = 1..n−1, j = 1..m−1:   if f(i,j) = 0 and ( f(i,j−1) = 1 or f(i−1,j) = 1 or f(i,j+1) = 1 or f(i+1,j) = 1 ) then f(i,j) ⇐ 1     (E29)
(A ⊖ B)(w,q) = min{ f(w+i, q+j) − b(i,j) }     (E30)

such that

(w+i, q+j) ∈ D_A, (i,j) ∈ D_B     (E31)

where D_A and D_B are the domains of A and B [8].

Eqs. (32) and (33) represent fundamental definitions of the erosion (A ⊖ B), the inverse operation of dilation:

A ⊖ B = { s | (B)_s ⊆ A }     (E32)
A ⊖ B = { p | B_p ⊆ A }     (E33)

Erosion reduces the set of edges, eliminates small white dots, and expands the small black dots of an image; it is the intersection of the translations of A by each pixel b_i of the structural element B:

A ⊖ B = ⋂_{b_i ∈ B} A_{b_i}     (E34)

Therefore

for i = 1..n−1, j = 1..m−1:   if f(i,j) = 1 and ( f(i,j−1) = 0 or f(i−1,j) = 0 or f(i,j+1) = 0 or f(i+1,j) = 0 ) then f(i,j) ⇐ 0     (E35)
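The pixel rules of Eqs. (29) and (35) can be written directly in Python/NumPy. This sketch applies one dilation or erosion step over the whole image using the four-neighbour test of the equations; the function names are assumptions:

```python
import numpy as np

def dilate_once(f):
    """Eq. (29): a background pixel (0) becomes foreground (1) if any
    of its four neighbours is foreground."""
    p = np.pad(f, 1, constant_values=0)
    # OR of the four shifted images: 1 where at least one neighbour is 1
    neighbour = (p[:-2, 1:-1] | p[2:, 1:-1] | p[1:-1, :-2] | p[1:-1, 2:])
    return f | ((f == 0) & (neighbour == 1)).astype(f.dtype)

def erode_once(f):
    """Eq. (35): a foreground pixel (1) becomes background (0) if any
    of its four neighbours is background."""
    p = np.pad(f, 1, constant_values=0)
    # AND of the four shifted images: 1 only where all neighbours are 1
    neighbour = (p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:])
    return f & neighbour

# Demo: a single pixel dilates to a 5-pixel cross; eroding the cross
# returns the single centre pixel.
f = np.zeros((5, 5), dtype=int)
f[2, 2] = 1
d = dilate_once(f)
e = erode_once(d)
```

One dilation followed by one erosion (an opening-like round trip) recovers the original pixel here.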

Figure 16.

Directional morphology, (a) tool size erosion and (b) tool path erosion [8].

Eqs. (29) and (35) of directional morphology are modified to generate the structural element of the insert and the tool path, using the workspace boundaries, the edge, and the perimeter function of the piece, generating Eq. (39); the results are provided in Figures 16 and 8:

SE1(i,j) = ⋃_{a=−2}^{2} (i, j+a), for i = 1..n−1, j = 1..m−1     (E36)
SE2(i,j) = ⋃_{b=−2}^{2} (i+b, j), for i = 1..n−1, j = 1..m−1     (E37)
SE_{n2}(i,j) = ⋃_{c=−1}^{1} ⋃_{d=−1}^{1} (i+c, j+d), for h = n/4..n, i = 1..n−1, j = 1..m−1     (E38)
SE = ⋃_{i=1}^{n−1} ⋃_{j=1}^{m−1} { SE1(i,j) ∪ SE2(i,j) ∪ ⋃_{h=n/4}^{n} SE_{n2}(i,j) }     (E39)
if SE_{n2} = SE1 or SE_{n2} = SE2, then eliminate SE_{n2}     (E40)

where SE1 is the first piece of the structural element, given by

SE1 = ⋃_{a=−2}^{2} (i, j+a) = (i, j−2) ∪ (i, j−1) ∪ (i, j) ∪ (i, j+1) ∪ (i, j+2)     (E41)
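The line-shaped structural elements of Eqs. (41) and (37) can be generated as coordinate lists; a small illustrative sketch (function names and the `half` parameter are assumptions):

```python
def se_horizontal(i, j, half=2):
    """Eq. (41): SE1 is the pixel (i, j) together with its horizontal
    neighbours (i, j-2) ... (i, j+2)."""
    return [(i, j + a) for a in range(-half, half + 1)]

def se_vertical(i, j, half=2):
    """Eq. (37): SE2 is the vertical counterpart of SE1."""
    return [(i + b, j) for b in range(-half, half + 1)]
```

For example, `se_horizontal(0, 0)` yields the five-pixel horizontal line of Eq. (41).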

Step 5. Automatic tool selections

Table 5 shows the most common cutting tools. The dimensions of the cutting tool are obtained from manuals and introduced into the software to be converted into structural elements with the length, orientation, and nose radius, as shown in Figure 16(a). Among the common dimensions are angles of 15°, 35°, and 45°, with a 9.525 mm length needing 36 pixels, a 25.40 mm length requiring a mesh of 25.5 square units of pixels, and the structural element having 414 pixels; see Figure 17(a) [8].

Table 5.

Common cutting tool (structural element) used.

The cutting tools are shown in Table 5, where D is the diameter, d is the tip diameter, L is the major length, l is the minor length, FI is the usable length, Sd is the presetting, and φ is the corner radius of the cutting tool. The tool is generated from handbooks; for example, a real insert such as a drilling cutting tool with a diameter of 5.56 mm (D, 21 pixels) requires a structural element of π(20)² pixels. For more details, see Figure 17(a).

Figure 17.

Tool selection, (a) structural element and (b) incorrect tool and correct tool [8].

The conditions for machining that determine the cutting tool are given by Eq. (42), which generates the movements of the structural element through the image; the angle labels the start of the selection of the cutting tool in machining:

for j = 1..m−1, i = 1..n−1:   if M(i,j) = 1 then select insert { μ(i,j) ⇐ diameter }     (E42)

The structural element (Figure 17a) is designed to create a trajectory using Eq. (42). To check an intersection (Figure 17b), it is necessary that the pixels have the same coordinates. The software for automatic tool selection is developed in Microsoft Visual C++ 2010, generating a matrix with the coordinates (i, j) of the image.

After the correct holders and inserts have been selected for each area, many cutting-tool changes will probably be required. The number of changes can be reduced by ordering them with the following procedure:

for i = 1..n−1, j = 1..m/2:   if S_p(i,j) = 1 and Σ_{h=−1}^{1} Σ_{ψ=−1}^{1} λ[i+ψ][j+1] = 1, then Tray(i,j) ⇐ λ[i+ψ][j+1]     (E43)

To determine the tool path, each pixel should share information with its eight neighbors and follow the path of the perimeter (edge S_p) in the mesh λ[i+ψ][j+1].

Step 6. Generation of trajectory

In the present article, zig and zig-with-contour strategies were developed to generate the trajectories. A structural element with the shape of the cutting tool, with diameter D and a given length, is displaced through the whole image from right to left. If there is an intersection between the structural element and the edge of the piece, another tool is selected [8].

Figure 18.

Tool selection (final piece), (a) boundaries identification and (b) rough milling (zig) [8].

Finally, to move the tool in the image, it is necessary to use a counter (k=0, i=i+1) with the total number of pixels m. This method is obtained by Eq. (44) and is displayed in Figure 18(a) and (b) [8]:

for j = 1..m−1, i = 1..n−1:   Maq(i,j) ⇐ f(i,j); { if f(i,j) = Tray(i,j) = 1 then j = j + 1 }     (E44)

9. Results in mill

A new method is presented for tool selection using directional morphology. To validate the method, three examples were proposed.

Figure 19(a) and (b) depict machining with tools CB and C, with labels “4” and “3,” respectively; four tools are selected in total, using pocketing in zigzag with contour and applying dilation because the machining is external. After this, the machined area (labeled “2”) can be seen in Figure 19(c). In the same way, the piece finally machined with tool B is shown in Figure 19(d), and Figure 19(e) depicts the final workpiece obtained. Table 6 lists the parameters of the cutting tools selected by the proposed method.

Figure 19.

Tool selection for first piece based in residues of pixels [8].

Figure 20.

Tool selection for second piece using zigzag with contour [8].

| Area | Tool type | Rough milling D (mm) | Finish milling D (mm) |
|------|-----------|----------------------|-----------------------|
| 1    | B         | 0.0157               |                       |
| 2    | C         | 0.1500               |                       |
| 3    | C         | 0.3499               |                       |
| 4    | CB        | 0.3999               |                       |

Table 6.

Selected tools for the first piece.

The second piece is shown in Figure 20. In this case, the piece is machined using zigzag with contour and its boundaries. To select the tool automatically, an erosion operation is applied to the surface to remove pixels, since the desired geometry is internal. The three contours and machining approximations with different tools, marked with labels “3,” “2,” and “1,” are depicted in Figure 20(a), (b), and (c), respectively. Finally, Figure 20(d) depicts the machined piece. Table 7 shows the parameters of the three different tools selected by the method.
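A minimal sketch of this erosion step, assuming a square structuring element of radius r (the method itself uses an element shaped like the cutting tool): a pixel survives only if its whole (2r+1)×(2r+1) neighborhood is foreground, so eroding with a larger element discards the regions a large tool cannot reach inside an internal feature.

```cpp
#include <cassert>
#include <vector>

using Grid = std::vector<std::vector<int>>;

// Binary erosion with a square structuring element of radius r: a pixel
// is kept only if every pixel in its (2r+1)x(2r+1) neighborhood is
// foreground; pixels outside the image count as background.
Grid erode(const Grid& img, int r) {
    int n = (int)img.size(), m = (int)img[0].size();
    Grid out(n, std::vector<int>(m, 0));
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < m; ++j) {
            bool keep = true;
            for (int di = -r; di <= r && keep; ++di)
                for (int dj = -r; dj <= r && keep; ++dj) {
                    int ni = i + di, nj = j + dj;
                    if (ni < 0 || nj < 0 || ni >= n || nj >= m ||
                        img[ni][nj] == 0)
                        keep = false;
                }
            out[i][j] = keep ? 1 : 0;
        }
    return out;
}
```

Eroding a solid 5×5 block with r = 1 leaves only the interior 3×3 core, i.e., the area a 3×3 element can reach without crossing the boundary.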

Figure 21 shows the rough milling of the last piece using zig with contour; Figure 21(a)–(c) depict the stages with their respective tool selections. Figure 21(d) shows the finish milling and Figure 21(e) the final piece. The tools selected for this third example are listed in Table 8.

| Area | Tool type | Rough milling D (mm) | Finish milling D (mm) |
|------|-----------|----------------------|-----------------------|
| 1    | B         | 0.0157               |                       |
| 2    | C         | 0.2000               |                       |
| 3    | C         | 0.3999               |                       |

Table 7.

Selected tools for second piece.

| Area | Tool type | Rough milling D (mm) | Finish milling D (mm) |
|------|-----------|----------------------|-----------------------|
| 1    | B         | 0.0157               |                       |
| 2    | C         | 0.0499               |                       |
| 3    | C         | 0.1500               |                       |
| 4    | D         | 0.5999               |                       |

Table 8.

Selected tools for third example.

Figure 21.

Tool selection for third piece (rough mill using zig with contour) [8].

Figure 22.

Final software for automatic tool selection [8].

Figure 22 shows the software developed in Microsoft Visual C++ 2010 for the automatic tool selection.


10. Conclusions of mill

Image processing for manufacturing three-dimensional models that require movements in three axes, based on directional morphology to detect collisions while the workpiece and the tool are moving, is a new method in the literature. The automatic selection of the cutting tool and the generation of tool paths to manufacture pieces in three-axis machining have been presented. The advantages of this method are fast, easy-to-implement programming and correct selection of the cutting tool; traditional morphological operations such as dilation and erosion are used in conjunction with the piece edge to obtain tool dimensions and automatic tool selection. In order to diminish the possible error in boundary extraction, a DXF file was used to compare and correct the boundary when an image with a resolution of 1 μm is utilized.

References

1. Perez Paina G., Araguas G., Gaydou D., Steiner G., and Rafael Canali L., RoMAA‐II, an open architecture mobile robot. Lat Am Trans IEEE (Revista IEEE Am Lat). 2014; 12(5): 915–921.
2. Jimenez Moreno R. and Brito M., Path planning for a mobile robot in a 3D environment. In 2014 IEEE Biennial Congress of Argentina (ARGENCON), 2014; 125–129.
3. Gadelmawla E.S., Eladawi A.E., Abouelatta O.B., et al., Application of computer vision for the prediction of cutting conditions in milling operations. Proc IMechE, Part B: J Eng Manuf. 2009; 223: 791–800.
4. Al‐Kindi G. and Zughaer H., An approach to improved CNC machining using vision‐based system. Mater Manuf Process. 2012; 27(7): 765–774.
5. Boix E., Ferrer A., Fernandez S., and Sellart X., Vehicle guiding system through image processing in crash and misuse tests. In Proc SAE‐China Congress 2014: Selected Papers. 2015; 411–424.
6. Eladawi A.E., Gadelmawla E.S., Elewa I.M., et al., An application of computer vision for programming computer numerical control machines. Proc IMechE, Part B: J Eng Manuf. 2003; 217: 1315–1324.
7. Sarma P.M.M.S., Karunamoorthy L., and Palanikumar K., Surface roughness parameters evaluation in machining GFRP composites by PCD tool using digital image processing. J Reinf Plast Compos. 2009; 28: 1567–1585.
8. Mejia‐Ugalde M., Trejo‐Hernandez M., Dominguez‐Gonzalez A., Osornio‐Rios R.A., and Benitez‐Rangel J.P., Directional morphological approaches from image processing applied to automatic tool selection in computer numerical control milling machine. Proc IMechE, Part B: J Eng Manuf. 2013; 227: 1607–1619.
9. Mejia‐Ugalde M., Dominguez‐Gonzalez A., Trejo‐Hernandez M., Morales‐Hernandez L.A., and Benitez‐Rangel J.P., New approach for automatic tool selection in computer numerically controlled lathe by applying image processing. Proc IMechE, Part B: J Eng Manuf. 2012; 226: 1298–1308.
10. Mejia‐Ugalde M., Dominguez‐Gonzalez A., Trejo‐Hernandez M., Morales‐Hernandez L., Osornio‐Rios R., and Benitez‐Rangel J., Triangulation intersection approach from Poisson's equation applied to automatic tool selection in computer numerical control mill‐lathe. Proc IMechE, Part B: J Eng Manuf. 2014; 230: 722–731.
11. Chen T.H., Chang W.T., Shen P.H., and Tarng Y.S., Examining the profile accuracy of grinding wheels used for microdrill fluting by an image‐based contour matching method. Proc IMechE, Part B: J Eng Manuf. 2010; 224: 899.
12. Siller H.R., Vila C., Rodriguez C.A., and Abellan J.V., Study of face milling of hardened AISI D3 steel with a special design of carbide tools. Int J Adv Manuf Technol. 2009; 40: 12–25.
13. Ahmad Z., Rahmani K., and D'Souza R.M., Applications of genetic algorithms in process planning: tool sequence selection for 2.5‐axis pocket machining. J Intell Manuf. 2010; 21: 461–470.
14. Lim T., Corney J., Ritchie J.M., and Clark D.E.R., Optimizing tool selection. Int J Prod Res. 2001; 39: 1239–1256.
15. Ramaswami H., Shaw R.S., and Anand S., Selection of optimal set of cutting tools for machining of polygonal pockets with islands. Int J Adv Manuf Technol. 2011; 53: 963–977.
16. Maros T., Vladimír B., and Testik M.C., Monitoring chenille yarn defects using image processing with control charts. Tex Res J. 2011; 81: 1344–1353.
17. Mallik‐Goswami B. and Datta A.K., Detecting defects in fabric with laser‐based morphological image processing. Tex Res J. 2000; 70: 758–762.
18. Li J., Huang P., Wang X., et al., Image edge detection based on beamlet transform. J Syst Eng Electron. 2009; 20: 1–5.
19. Tufoi M., Vela I., Marta C., et al., Optimization of withdrawing cylinder at vertical continuous casting of steel using CAD and CAE. Int J Mech. 2011; 5: 10–18.
