Open access peer-reviewed chapter

The Geometry and Usage of the Supplementary Fisheye Lenses in Smartphones

Written By

Cumhur Sahin

Submitted: 16 December 2016 Reviewed: 10 May 2017 Published: 02 November 2017

DOI: 10.5772/intechopen.69691

From the Edited Volume

Smartphones from an Applied Research Perspective

Edited by Nawaz Mohamudally


Abstract

Nowadays, mobile phones are much more than devices that merely satisfy the need for communication between people. Fisheye lenses integrated with mobile phones are advantageous because they are lightweight and easy to use. Beyond this advantage, this chapter examines experimentally whether a fisheye lens and mobile phone combination can be used photogrammetrically and, if so, with what results. Fisheye lens equipment used with mobile phones was tested in this study. For this purpose, the standard calibration of the ‘Olloclip 3 in one’ fisheye lens used with an iPhone 4S mobile phone and the ‘Nikon FC‐E9’ fisheye lens used with a Nikon Coolpix8700 camera are compared on the basis of the equidistant model. This experimental study shows that the Olloclip 3 in one fisheye lens developed for mobile phones has characteristics at least comparable to those of classic fisheye lenses. The dimensions of fisheye lenses used with smartphones are getting smaller and their prices are decreasing. Moreover, as verified in this study, the accuracy of fisheye lenses used on smartphones is better than that of conventional fisheye lenses. The use of smartphones with fisheye lenses will offer ordinary users the possibility of practical applications in the near future.

Keywords

  • smartphone
  • fisheye lenses
  • equidistant projection model
  • distortion geometry
  • supplementary lenses

1. Introduction

Nowadays, mobile phones are much more than devices that merely satisfy the need for communication between people. Cameras and other integrated devices are found in almost every smartphone. Beyond these built-in devices, there are tele, macro and fisheye lenses that can easily be attached to smartphones. Some of these lens kits are presented in Refs. [1, 2]. Fisheye lenses compatible with mobile phones are one such new type of equipment. Fisheye lenses integrated with mobile phones are advantageous because they are lightweight and easy to use. Additionally, these lenses are cost efficient compared to conventional fisheye lenses. The characteristics of the Olloclip lens used in this study are presented in Ref. [3]. Cameras on mobile phones are as capable as the compact cameras that we use in our daily lives. Using smartphone cameras for image acquisition instead of conventional cameras has opened a new field of scientific study, and combining smartphone cameras with developing technologies has enabled studies that were not possible before. Chugh et al. [4] present a detailed survey of methods for detecting road conditions. Smartphone sensors are gaining importance in this field, as they are cost effective and also increase scalability; judging from current research activity, this area will certainly gain further importance in the near future. The objective of the research in Ref. [5] is to improve traffic safety by collecting and distributing up‐to‐date road surface condition information using mobile phones. Perttunen et al. [5] present experimental results from real urban driving data that demonstrate the usefulness of the system. To monitor road and traffic conditions in such a setting, Mohan et al. [6] present Nericell, a system that performs rich sensing by piggybacking on the smartphones that users carry with them in the normal course of their day. Mohan et al. [6] focus specifically on the sensing component, which uses the accelerometer, microphone, GSM radio and/or GPS sensors in these phones to detect potholes, bumps, braking and honking. Wagner et al. [7] present two techniques for natural feature tracking in real time on mobile phones, using an approach based on heavily modified state‐of‐the‐art feature descriptors, namely the scale invariant feature transform (SIFT) and Ferns. Object‐wise 3D reconstruction is a cardinal problem in computer vision, and much work has been dedicated to it in recent years. Unlike approaches that use global computation, Prisacariu et al. [8] adopt a local computation method related to the signed distance transformation and its derivatives. With this method, 3D renderings are obtained quickly by hierarchical ray casting; the tracker achieves real‐time performance on mobile phones and speeds faster than 100 fps on a PC, without requiring GPU acceleration [8]. Tanskanen et al. [9] propose the first dense stereo‐based system for live interactive 3D reconstruction on mobile phones. Pan et al. [10] present a novel system that allows the generation of a coarse 3D model of the environment within several seconds on mobile smartphones. The contribution of this work is a novel approach to generate visually appealing, textured 3D models from a set of at least three panoramic images on mobile phones without the need for remote processing [10]. Wagner et al. [11] present a novel method for the real‐time creation and tracking of panoramic maps on mobile phones.
The maps generated with this technique are visually appealing, very accurate and allow drift‐free rotation tracking. Nowadays, smartphones are widely used around the world and are generally equipped with many sensors. Almazan et al. [12] study how powerful the low‐cost embedded Inertial Measurement Unit (IMU) and Global Positioning System (GPS) could become for intelligent vehicles. Their main contribution is the method employed to estimate the yaw angle of the smartphone relative to the vehicle co‐ordinate system. The results show that the system achieves high accuracy, with a typical error of 1%, and is immune to electromagnetic interference [12]. Recently, mobile phones have become increasingly attractive for augmented reality (AR). The advent of GPS and orientation sensors on commodity mobile devices has led to the development of numerous mobile AR applications and to broader public awareness and use of these applications. By using the phone orientation sensor to display the appropriate subset of a panorama, orientation accuracy can be effectively increased and augmentations tightly registered with the background [13]. Kurz and Benhimane [14] presented novel approaches that use the direction of gravity, measured with inertial sensors, to improve different parts of the pipeline of handheld AR applications. Among all possible applications, AR systems can be very useful as visualization tools for structural and environmental monitoring. Porzi et al. [15] presented a successful implementation of an egomotion estimation algorithm on an Android device by porting the tracking module of parallel tracking and mapping (PTAM), and described the development of this egomotion estimation algorithm for an Android smartphone. In recent decades, many indoor positioning techniques have been researched, and some approaches have even been developed into consumer products. In Ref. [16], two devices, the iPhone 3GS and the iPhone 4, are selected to analyse the usability of their sensors for an inertial navigation system. A precise Inertial Navigation System (INS) cannot be achieved by a strapdown algorithm alone because of the inaccurate and noisy sensors used in both iPhones. In order to enhance the accuracy, several filters were used. Finally, strapdown algorithms were analysed and verified in related tests, and the best filter combination was found for each of the devices [16]. Burgess et al. [17] expand on previous work by using a multi‐floor model that takes dampening between floors into account, and optimize a target function consisting of least-squares residuals to find positions for the WiFi access points and the smartphone measurement locations [17]. Burgess et al. [18] presented a method for simultaneously mapping the radio environment and positioning several smartphones in multi‐storey buildings.

Computer vision applications for mobile phones are gaining increasing attention due to several practical needs arising from the popularity of digital cameras in today’s mobile phones. Hadid et al. [19] describe the task of face detection and authentication on mobile phones and experimentally analyse a face authentication scheme using Haar‐like features with AdaBoost for face and eye detection, and a local binary pattern (LBP) approach for face authentication. Shen et al. [20] address the challenges of performing face recognition accurately and efficiently on smartphones by designing a new face recognition algorithm called opti‐sparse representation classification (opti‐SRC). Sparse representation classification (SRC) is a state‐of‐the‐art face recognition algorithm, which has been shown to outperform many classical face recognition algorithms implemented in OpenCV.

Monitoring the aquatic environment is of great interest for the ecosystem, marine life and human health [21]. An efficient method for monitoring marine debris is the smartphone-based aquatic robot (SOAR), a low-cost robotic system designed to monitor debris in water environments. It consists of a smartphone and a robotic fish platform: the robotic fish provides the capability of moving through water, while the smartphone is used to capture images [22]. Another system for detecting debris is Samba, an aquatic robot that likewise combines a smartphone with a robotic fish platform to monitor harmful marine debris. Using the camera of the smartphone, Samba can recognize aquatic debris in dynamic and complex environments [22]. Maindalkar and Ansari [23] present the design of an aquatic robot for monitoring aquatic pollutants. An Android smartphone is integrated with the aquatic robot to capture images and to acquire data from different sensors. The implemented design contains a computer vision (CV) algorithm for image processing on the OpenCV platform, and real-time pollutant detection is performed efficiently with this algorithm [23].

Muaremi et al. [24] investigate the potential of a modern smartphone and a wearable heart rate monitor for assessing affect changes in daily life, using smartphone features and heart rate variability (HRV) measures as predictors for building classification models that discriminate among low, moderate and high perceived stress. As smartphones evolve, researchers are studying new techniques to ease human‐mobile interaction. With EyePhone, the user interface of a mobile phone can be operated through eye tracking and blink detection. These results are preliminary, but they suggest that EyePhone is a promising tool for driving mobile applications automatically [25]. The advent of mobile sensing technology provides a potential solution to the challenge of collecting repeated information about both behaviours and situations, such as detecting the type of situation using the sensors built into today’s ubiquitous smartphones [26]. Sandstrom et al. [26] focused on using location sensors to learn the semantics of places, so that relationships between place, affect and personality could be examined. Sensor‐enabled smartphones are opening a new frontier in the development of mobile sensing applications, and the recognition of human activities and context from sensor data using classification models underpins these emerging applications [27]. The key contribution of community similarity networks (CSN) is that they make the personalization of classification models practical by significantly lowering the burden on the user through a combination of crowd‐sourced data and leveraging networks that measure the similarity between users [27]. Lu et al. [28] present Jigsaw, a continuous sensing engine for mobile phone applications that require continuous monitoring of human activities and context; supporting continuous sensing applications on mobile phones is very challenging. Lu et al. [29] propose StressSense for unobtrusively recognizing stress from the human voice using smartphones. Lane et al. [30] discuss the emerging sensing paradigms and formulate an architectural framework for discussing a number of open issues and challenges emerging in the new area of mobile phone sensing research. Rachuri et al. [31] presented EmotionSense, a novel system for the social-psychology study of user emotion based on mobile phones, together with the design of novel components for emotion and speaker recognition based on Gaussian mixture models. The driving vision of Ref. [32] is a smartphone service, called Mood‐Sense, that can infer its owner’s mood based on information already available in today’s smartphones. In Ref. [32], it is suggested that user mood can be separated into four main types with 91% average accuracy. These results can be obtained with 3 weeks of data and basic smartphone usage statistics. Although these results are not decisive, they show the practicability of mood inference without any microphone and/or camera with their bulky power requirements, and without social interaction data [32].

Recently, calibration methods using display devices such as monitors, tablets or smartphones have come to the forefront [33]. Gruen and Akca [34] report on first experiences in the calibration and accuracy validation of mobile phone cameras. Ha et al. [33] propose a novel camera calibration method for defocused images using a smartphone, under the assumption that the defocus blur is modelled as the convolution of a sharp image with a Gaussian point spread function (PSF). The effectiveness of the proposed method has been demonstrated in several real experiments using a compact display device such as a smartphone [33]. Delaunoy et al. [35] propose a new approach to estimate the geometric extrinsic calibration of all the elements of a smartphone or tablet (such as the screen and the front and back cameras) by using a planar mirror. Saponaro and Kambhamettu [36] described a method for calibrating a smartphone camera by taking two images at different rotations while tolerating small translations. Ahn et al. [37] analysed the accuracy of smartphone images in determining the three‐dimensional locations of approximated objects, in advance of the development of a smartphone-based photo survey system, and evaluated its usability.

Fisheye lenses provide instant wide‐angle images from a single point with a single camera. Fisheye optics are mounted on charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) cameras without requiring any complex technology; they do not require an external mirror or rotational device. Thus, these optics are small in size and do not require any maintenance [38]. They have a very short focal length, which produces a hemispherical image [39]. By using fisheye lenses, a large part of the surrounding space can be captured in a single photograph. Therefore, fisheye lenses are useful in many applications. In addition to high-quality landscape and interior visualizations (e.g. ceiling frescos of historical buildings) in commercial demonstrations or internet presentations, fisheye images are also useful for measurement tasks [40].

The first fisheye lens was created by Hill in 1924 [41], but fisheye lenses were long not preferred for photogrammetric measurements, since they produce images with large distortions and do not satisfy central projection. Using images obtained from fisheye lens imaging systems in photogrammetric measurement and modelling has become popular in recent years with the help of developments in software and hardware technologies, and a significant increase has since been seen in the volume of scientific research on this subject. Recently, several academic studies have demonstrated the benefits of fisheye lenses. Fisheye cameras are finding an increasing number of applications in surveillance, robotic vision, automotive rear‐view imaging systems, etc. because of their wide‐angle properties [42]. Fisheye lens cameras have also been used for sky observations [43], visual sun compass creation [44] and sunpath diagram derivation [45]. Beekmans et al. [46] present a complete approach for stereo cloud photogrammetry using hemispheric sky imagers, combining calibration, epipolar rectification and block‐based correspondence search for dense fisheye stereo reconstruction of clouds. A novel panoramic imaging system that uses a curved mirror as a simple optical attachment to a fisheye lens is given in Ref. [47]. Streckel et al. [48] describe a visual markerless real‐time tracking system for augmented reality applications; the system uses a FireWire camera with a fisheye lens running at 10 fps. Brun et al. [49] present a new mobile mapping system mounted on a vehicle to reconstruct outdoor environments in real time. Yamamoto et al. [50] propose a mobile web map interface based on the metaphor of a wired fisheye lens. The user can easily navigate the area surrounding the present location while keeping the focus within the map, and these features enable users to find a target quickly. Yamamoto et al. [50] confirmed the advantages of the proposed system through evaluation experiments; the new system can contribute to novel mobile web map services with fisheye views for mobile terminals such as cellular phones. Ahmad and Lima [51] present a cooperative approach for tracking a moving spherical object in three‐dimensional space with a team of mobile robots equipped with sensors, in a highly dynamic environment. Zheng and Li [52] explore the use of a fisheye camera for scene tunnel acquisition. In Ref. [53], the authors focus on dioptric systems and implement a robot surveillance application for fast and robust tracking of moving objects in dynamic, unknown environments. Another application of fisheye lenses is research at Queensland University of Technology in Australia examining their use as optical sensors on an unmanned aerial vehicle (UAV) platform [54]. Grelsson [55] used a fisheye camera for horizon detection in aerial images. Naruse et al. [56] propose a three‐dimensional measurement method for underwater objects using a fisheye stereo camera. In Ref. [57], a novel technique is proposed to accurately estimate the global position of a moving car using an omnidirectional camera and an untextured three‐dimensional city model. Today, one of the areas that most frequently benefits from fisheye lenses is applications combined with terrestrial laser scanners. Georgantas et al. [58] present a comparison of automatic photogrammetric techniques with terrestrial laser scanning for the three‐dimensional modelling of complex interior spaces. The 8 mm fisheye lens that was used allowed photos to be acquired with a global view of the scene and thus with textured zones in every image, which is essential for the scale invariant feature transform (SIFT) algorithm. Image analysis tasks such as 3D reconstruction from endoscopic images require compensation of the geometric distortions introduced by the lens system [59]. Hu et al. [60] propose effective pre‐processing techniques to ensure the applicability of face detection tools to highly distorted fisheye images.

Schneider and Schwalbe [61] present the integration of a geometric model of fisheye lenses and a geometric terrestrial laser scanner model into a bundle block adjustment. Fisheye projection functions are designed such that a greater portion of the scene is projected onto the image sensor, at the expense of introducing (often considerable) radial distortion [62]. A fisheye lens camera should be calibrated before being used in applications that require high accuracy [63]. There are different studies in the literature that focus on the calibration of fisheye lenses. Abraham and Forstner [38] presented rigorous mathematical models for the calibration of a stereo system composed of two fisheye lens cameras and for the epipolar rectification of the images acquired by this dual system.

Arfaoui and Thibault [64] described a method using a compact calibration object for fisheye lens calibration. The setup generated a robust and accurate virtual calibration grid, and the calibration was performed by rotating the camera around two axes. The experimental results and a comparison with a 3D calibration object showed that the virtual grid method is efficient and reliable [64]. Kim and Paik [65] presented a novel 3D simulation method for fisheye lens distortion in a vehicle rear‐view camera. The proposed method creates a geometrically distorted image of an object in 3D space according to the lens specifications, and the simulation method can be applied to designing general optical imaging systems for intelligent surveillance as well as vehicle rear‐view backup cameras [65]. Torii et al. [66] present a pipeline for camera pose and trajectory estimation, and for image stabilization and rectification, for dense as well as wide-baseline omnidirectional images. Their experiments with real data demonstrate the use of the proposed image stabilization method; five image sequences of a city scene captured with a single hand‐held fisheye lens camera are used as input [66].

In Ref. [67], a Kodak DSC 14 Pro with a Nikkor 8 mm fisheye lens is calibrated with an equidistant projection. In addition to decentring, symmetric radial and affinity distortion models, precise mathematical models were used, based on the stereographic, equidistant, orthogonal and equisolid‐angle projections. Kannala and Brandt [68] propose a generic camera model that is suitable for fisheye lens cameras as well as for conventional and wide‐angle lens cameras, together with a calibration method for estimating the parameters of the model. Fisheye lenses are not perspective lenses: their image resolution is not uniform across the image, and the illumination is not distributed homogeneously [69]. Up to now, many researchers have considered the relationship between the distorted and undistorted radius in the image plane while ignoring the variation of the angle. Zhu et al. [70] present a fisheye camera model based on the refractive nature of the incoming rays and estimate the model parameters without calibration objects using Micusik’s method [71]. In photogrammetry, the collinearity mathematical model, based on perspective projection combined with lens distortion models, is generally used in the camera calibration process. However, fisheye lenses are designed to follow different spherical projection models, namely the stereographic, equidistant, orthogonal and equisolid-angle projections [63]. The calibration results of a Fuji‐Finepix S3pro camera with a Bower‐Samyang 8 mm lens were assessed with the help of precise mathematical models. The Bower‐Samyang 8 mm is cheaper than other fisheye lenses and, unlike the others, is based on the stereographic projection [63].

Most fisheye lenses are technically based on the equidistant or equisolid‐angle projection. Diagonal fisheye lenses are typically constructed on the basis of equisolid‐angle projection geometry; the distortion at the image edges is then more significant than for fisheye lenses with equidistant projection. Orthographic projection geometry can only be realized with sophisticated optical construction, and stereographic projection is not practically realizable [67]. Among the other proposed models, an important one is the equidistant model, which proposes that the distance between an image point and the centre of radial distortion is proportional to the angle between the corresponding three‐dimensional point, the optical centre and the optical axis [72]. Equidistant fisheye lenses are often used for scientific measurements where the measurement of angles is necessary; thus, this type is also sometimes referred to as an equiangular fisheye lens [73]. Perhaps the most common model is the equidistance projection [68]. Friel et al. [74] use the equidistance projection equation to describe the radial distortion, as this is typically among the most commonly used and least expensive fisheye lens types. The work described in Ref. [74] shows that it is possible to carry out automatic calibration of fisheye lenses using information derived from real‐world automotive scenes, and to obtain calibration data with a high degree of accuracy.

The main purpose of this study is to test fisheye lens equipment used with mobile phones. Mobile phone imaging with additional hardware has become increasingly popular not only outdoors but also in indoor applications. Therefore, the hardware properties of such wide‐angle optics will be used in photogrammetric documentation with mobile phone imaging in the near future. Fisheye lenses integrated with mobile phones are advantageous because they are lightweight and easy to use. Beyond this advantage, it is tested experimentally whether a fisheye lens and mobile phone combination can be used photogrammetrically and, if so, with what results. In this study, the standard calibration of the ‘Olloclip 3 in one’ fisheye lens used with an iPhone 4S mobile phone and the ‘Nikon FC‐E9’ fisheye lens used with a Nikon Coolpix8700 camera are compared on the basis of the equidistant model. The results of these calibrations are analysed by means of photogrammetric bundle block adjustment. The geometric properties of these wide‐angle lenses will become more important in photogrammetric measurement assessment. This study suggests a pre‐calibration process for this kind of hardware in a test field prior to photogrammetric processing. Although there are many publications on geometric camera calibration in the literature, none of them compares a mobile phone fisheye lens kit with a conventional fisheye lens on the fundamentals of photogrammetric measurement assessment. The results of this photogrammetric process are therefore also compared with conventional wide‐angle hardware in this chapter.

The second section of this chapter briefly describes fisheye projection models, and the third section describes the equidistant model. The fourth section reports an empirical study on the calibration of the iPhone 4S camera with Olloclip 3 in one fisheye lens combination and the Nikon Coolpix8700 camera with FC‐E9 fisheye lens combination using the equidistant model. The fifth section interprets the results of the experimental process, and the sixth section concludes the study.


2. Fisheye projection models

Pinhole projection is also called rectilinear projection because it preserves the rectilinearity of the projected scene (i.e. straight lines in the scene are projected as straight lines on the image plane). The pinhole (perspective) projection is shown in Figure 1, and its mapping function is given in Eq. (1).

Figure 1.

Pinhole (perspective) projection representation.

ru = f·tan(θ)        (perspective projection)        (1)

where f is the distance between the principal point and the image plane, θ is the incident angle (in radians) of the projected ray to the optical axis of the camera and ru is the projected radial distance from the principal point on the image plane. However, for wide field of view (FOV) cameras, under rectilinear projection, the size of the projected image becomes very large, increasing to infinity at an FOV of 180° [62].
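To make this limitation concrete, the following minimal Python sketch (an illustration added here, not part of the original study) evaluates Eq. (1) for increasing incidence angles; the projected radius grows without bound as θ approaches 90°, which is why a rectilinear camera cannot reach a 180° FOV:

```python
import math

f = 4.3  # focal length in mm; a hypothetical value chosen only for illustration

# Eq. (1): perspective (pinhole) projection, r_u = f * tan(theta).
for theta_deg in (10, 30, 60, 80, 89, 89.9):
    r_u = f * math.tan(math.radians(theta_deg))
    print(f"theta = {theta_deg:5.1f} deg  ->  r_u = {r_u:10.2f} mm")
# The projected radius diverges as theta approaches 90 deg,
# i.e. as the field of view approaches 180 deg.
```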

Interior orientation parameters (IOPs) can be estimated by a procedure called camera calibration. The perspective bundle that generated the image can be reconstructed by this procedure. The principal point co‐ordinates, the focal length and the coefficients for systematic error correction (lens distortion: symmetric radial, decentring and affinity) are the IOPs of digital cameras. The collinearity equations, extended with the additional parameters (IOPs) in Eq. (2) [75], are the most popular camera calibration model [63, 76].

xf = x′ − x0 − Δx = −f·Xc/Zc
yf = y′ − y0 − Δy = −f·Yc/Zc        (2)

where f represents the focal length; (Xc, Yc, Zc) are the 3D point co‐ordinates in the photogrammetric reference system, given by Eq. (3); (xf, yf) are the image point co‐ordinates; (x′, y′) are the image point co‐ordinates in a reference system parallel to the photogrammetric system, with its origin at the image centre; and (x0, y0) are the co‐ordinates of the principal point (pp).

Xc = r11·(X − XCP) + r12·(Y − YCP) + r13·(Z − ZCP)
Yc = r21·(X − XCP) + r22·(Y − YCP) + r23·(Z − ZCP)
Zc = r31·(X − XCP) + r32·(Y − YCP) + r33·(Z − ZCP)        (3)

where rij (i, j = 1, …, 3) are the elements of the rotation matrix relating the object reference system to the image reference system; (X, Y, Z) are the co‐ordinates of any point in the object reference system; and (XCP, YCP, ZCP) are the co‐ordinates of the perspective centre (PC) in the object reference system [63].
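As an illustration of Eqs. (2) and (3), the following minimal Python sketch (with hypothetical orientation values, added here only for clarity) projects an object point into image co‐ordinates under the pinhole model:

```python
import numpy as np

# Hypothetical orientation values, chosen only to make the example concrete.
f = 9.0                            # focal length [mm]
x0, y0 = 0.01, -0.02               # principal point co-ordinates [mm]
R = np.eye(3)                      # rotation matrix (r11..r33); identity here
X_cp = np.array([0.0, 0.0, 0.0])   # perspective centre in the object system

X = np.array([0.5, 0.3, 5.0])      # object point (X, Y, Z) [m]

# Eq. (3): transform the object point into the camera system.
Xc, Yc, Zc = R @ (X - X_cp)

# Eq. (2): collinearity equations (distortion corrections set to zero here).
dx = dy = 0.0
x_img = x0 + dx - f * Xc / Zc      # x' = x0 + dx - f*Xc/Zc
y_img = y0 + dy - f * Yc / Zc      # y' = y0 + dy - f*Yc/Zc
print(f"x' = {x_img:.4f} mm, y' = {y_img:.4f} mm")
```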

The pinhole (perspective) projection model is not suitable for fisheye lenses. Fisheye lenses are instead usually designed to obey one of the following projections [68]:

r = 2·f·tan(θ/2)        (stereographic projection)        (4)
r = f·θ                 (equidistant projection)          (5)
r = 2·f·sin(θ/2)        (equisolid angle projection)      (6)
r = f·sin(θ)            (orthogonal projection)           (7)

In Eqs. (1) and (4)–(7), the angle between the optical axis and the incoming ray is denoted by θ, the distance between the image point and the principal point by r, and the focal length by f. The equidistant projection can be regarded as the most widely used fisheye lens model. Figure 2a illustrates schematically the different projections for a fisheye lens, and Figure 2b shows the difference between a pinhole lens and a fisheye lens. Image points produced by the non‐perspective projections lie closer to the principal point than their counterparts under perspective projection; therefore, the view angle of a fisheye lens is wider than that of a conventional lens. Moreover, the actual image surface of a fisheye lens is a hemisphere, in contrast to the image plane of a pinhole lens. Thus, projecting the image on the surface of the hemisphere onto the actual (planar) imaging plane produces the characteristic deformation of the fisheye lens [77]. A numeric comparison of these projection functions is sketched after Figure 2.

Figure 2.

The principles of various lenses: (a) shows the different lens projections; p, p1, p2, p3 and p4 are, respectively, the perspective, stereographic, equidistance, equisolid-angle and orthogonal projections, and the corresponding distances between the image points and the principal point are denoted r, r1, r2, r3 and r4; (b) shows the difference between a pinhole lens and a fisheye lens. For a fisheye lens, the projection of the perspective image on the hemisphere surface onto the image plane is the actual image.
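The mapping functions of Eqs. (1) and (4)–(7) differ only in how the radial distance grows with the incidence angle. The short Python sketch below (an illustrative comparison with an arbitrary focal length, added here) evaluates all five side by side:

```python
import math

f = 4.3  # focal length in mm (arbitrary illustrative value)

def radial_distances(theta):
    """Radial distance r for each projection model, Eqs. (1) and (4)-(7)."""
    return {
        "perspective":   f * math.tan(theta),          # Eq. (1)
        "stereographic": 2 * f * math.tan(theta / 2),  # Eq. (4)
        "equidistant":   f * theta,                    # Eq. (5)
        "equisolid":     2 * f * math.sin(theta / 2),  # Eq. (6)
        "orthogonal":    f * math.sin(theta),          # Eq. (7)
    }

for deg in (15, 45, 75, 89):
    r = radial_distances(math.radians(deg))
    print(deg, {name: round(value, 2) for name, value in r.items()})
# All four fisheye models stay bounded near 90 deg; only the perspective
# radius diverges, in line with the discussion of Figure 2.
```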

A wide‐angle lens produces geometric distortion in the radial direction, called barrel distortion, since it compresses the peripheral region in order to contain a wide angle of view in the image plane. Considering this problem, many researchers have proposed various models to correct the barrel distortion of wide‐angle lenses. A two‐dimensional (2D) approximated barrel distortion model is shown in Figure 3, where an original pixel Pu moves towards the centre, to Pd, along the radial direction in the image plane [65]. A polynomial model was proposed to approximate various types of wide‐angle lenses using distortion coefficients, with the distance of the distorted pixel Pd determined by a polynomial equation [65]; a short sketch of this idea follows Figure 3.

Figure 3.

Radial distortion in the 2D imaging plane: O represents the image centre; Pu, Pd ∈ R² are the pixel co‐ordinate vectors in the input undistorted and output distorted images, respectively; ru, rd ∈ R are the distances of Pu and Pd from the centre; and θ = θu = θd is the angle of OPu or, equivalently, OPd.
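The exact polynomial of Ref. [65] is not reproduced in this chapter, so the sketch below uses a generic even-order polynomial in the radial distance, a common form for such models, to move a pixel Pu towards the centre along the radial direction; the coefficient values are hypothetical:

```python
import numpy as np

k1, k2 = -0.35, 0.08   # hypothetical barrel-distortion coefficients (k1 < 0)

def barrel_distort(p_u, centre):
    """Map an undistorted pixel P_u to its distorted position P_d (Figure 3)."""
    v = np.asarray(p_u, float) - centre       # vector from the image centre O
    r_u = np.linalg.norm(v)                   # undistorted radius (normalized units)
    scale = 1 + k1 * r_u**2 + k2 * r_u**4     # generic polynomial radial model
    return centre + scale * v                 # P_d stays on the same radial line

centre = np.array([0.0, 0.0])
print(barrel_distort([0.8, 0.6], centre))     # a peripheral pixel pulled inwards
```

Because the scaling acts only on the radius, the angle is preserved (θu = θd), exactly as in the 2D model of Figure 3.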


3. Equidistant projection function for fisheye lenses

In order to model a perfect fisheye lens, the scene projection must be defined. It can be characterized by two main properties. First, the field of vision covers 2π steradians, creating a circular image whose distortions are symmetrical with respect to the centre of the image. Second, a fisheye lens has an infinite depth of field: all objects in the image are in precise focus. Accordingly, two postulates, the azimuth angle invariability and the equidistant projection rule, govern the formation of the non‐linear image distortion. These presuppositions describe the projection of object points onto the sensor, and they directly shape the resulting dewarping algorithm [78].

The first postulate, azimuth angle invariability, concerns the projection of points of a plane that passes through the optical axis perpendicular to the sensor plane. The azimuth angle of object points and of their projections onto the sensor remains unchanged regardless of differences in object distance or elevation within the content plane [78]. According to Ref. [79], the equidistant lens is ‘preferable for measurement of incidence angles (θ) and azimuth angles. The effect of error of lens position is small, and the linear relation of radial distance (rd) and incidence angle (θ) of a ray from the three‐dimensional point is convenient to analyse’.

The second postulate, the equidistant projection rule, describes the relationship between the radial distance (rd) of an image point on the sensor plane and the zenith (incidence, θ) angle formed by the ray from the image centre to the world object point (Figure 4). According to this rule, there is a linear relationship between the radial distance rd from the centre to the image point and the zenith angle θ [78].

Figure 4.

Equidistant projection (θ/90 = dc/R) [43].

As the zenith angle varies from 0 to 90°, the radial distance of the corresponding image point varies linearly from 0 to a maximum value R, determined by the radius of the modelled sphere [78].

In the equidistant projection, the radial distance (rd) on the image plane is directly proportional to the angle of the incident ray. It is equal to the length of the arc segment between the z‐axis and the projection ray of point P on the sphere in Figure 5 [62].

Figure 5.

Equidistant fisheye projection function representation.

Thus, the equidistant projection function is given in Eq. (8).

rd = f·θ        (8)

where rd is the fisheye radial distance of a projected point from the centre, f is the focal length and θ is the incidence angle of the ray from the projected three‐dimensional point to the image plane. This is the most common mapping function used in fisheye cameras; the other mapping functions are the stereographic, equisolid and orthogonal projections [80]. Eq. (9) is derived by substituting the arctangent function for θ in Eq. (8), where ru is the height of the projection on the image plane (the subscript u denoting the undistorted projection) [73].

rd = f·arctan(ru/f)        (9)

In the equidistant projection model, the distorted radial distance on the image plane is thus linearly related to the projected ray’s angle in radians. Moreover, the length of the arc segment between the z‐axis and xp is equal to the projected distorted distance rd (xp being the intersection point of the projection ray of point X with the projection sphere) [73].
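A minimal sketch of this pair of relations, Eqs. (8) and (9), is given below; it maps an incidence angle to the distorted equidistant radius and converts between the undistorted (pinhole) radius ru and the distorted radius rd. The focal length is an arbitrary illustrative value:

```python
import math

f = 4.3  # focal length [mm], illustrative

def rd_from_theta(theta):
    """Eq. (8): equidistant projection, r_d = f * theta (theta in radians)."""
    return f * theta

def rd_from_ru(r_u):
    """Eq. (9): distorted radius from the undistorted pinhole radius r_u."""
    return f * math.atan(r_u / f)

def ru_from_rd(r_d):
    """Inverse of Eq. (9), valid for r_d/f < pi/2."""
    return f * math.tan(r_d / f)

r_u = 12.0                       # a point far from the principal point [mm]
r_d = rd_from_ru(r_u)            # compressed equidistant radius
print(r_d, ru_from_rd(r_d))      # the round trip recovers r_u
```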

Most real optical systems exhibit undesirable effects that render the assumption of the pinhole camera model inaccurate. The most evident of these effects is radial barrel distortion, which is particularly noticeable in fisheye camera systems, where the level of this distortion is relatively extreme [62]. For most applications, the effect of radial distortion is negligible in normal and narrow field of view (FOV) cameras. However, radial distortion can cause problems in wide‐angle and fisheye cameras, both visually and in the processing of computer vision applications such as object detection, recognition and classification [73]. Because of the radial lens distortion, points on the image plane are displaced in a non‐linear way from the ideal positions of the rectilinear pinhole camera model. The displacement occurs along a radial axis from the distortion centre on the equidistant image plane. Owing to this displacement behaviour of fisheye optics, the image has better resolution in the foveal areas, while the resolution of the peripheral areas decreases non‐linearly [81].

The additional parameters used to compensate for deviations of the geometric fisheye model from physical reality are the same parameters that are commonly applied for central perspective lenses [69]. Accordingly, the equidistant projection function with additional parameters is given in Eq. (10).

rd = f·arctan(ru/f) + additional parameters        (10)

Due to the particularly high levels of distortion present in fisheye cameras, several alternative models have been developed [81], such as the fisheye transform, the field-of-view model, the division model and the polynomial model [82]. The work in Ref. [67] investigates the addition of the Brown parameters to the basic geometric fisheye model to compensate for the remaining ‘systematic effects’ [82].

Three co‐ordinate systems are used to define the projection of an object point into a hemispherical fisheye image: the superordinate Cartesian object co‐ordinate system (X, Y, Z), the camera co‐ordinate system (x, y, z) (Figure 6) and the image co‐ordinate system (x′, y′). The image co‐ordinate system is defined as is usual in photogrammetric applications, so the image centre is its origin, and the x′ and y′ axes are parallel to the x and y axes of the camera co‐ordinate system [67]. The geometric concept is based on the dependence of the image radius r′ on the angle of incidence θ [61].

Figure 6.

Geometrical model of a fisheye camera.

Object co‐ordinates are transformed into the camera co‐ordinate system by Eq. (11), where X is the co‐ordinate vector in the object co‐ordinate system, x is the co‐ordinate vector in the camera co‐ordinate system, R is the rotation matrix and X0 is the translation between the object and camera co‐ordinate systems:

x = R⁻¹·(X − X0)        (11)

The incidence angle θ in the camera co‐ordinate system is defined as follows:

tan θ = √(x² + y²) / z        (12)

Instead of a function for the image radius r′, functions for the image co‐ordinates x′ and y′ are required. For this purpose, Eq. (13) is applied:

r′ = √(x′² + y′²)        (13)

After the transformations of the equations described above, the final fisheye projection equations for the image co‐ordinates are derived. The model equations are finally extended by the co‐ordinates of the principal point x′0 and y′0 [Eq. (14)] and by the correction terms Δx′ and Δy′ [Eqs. (15) and (16)], which contain additional parameters to compensate for systematic effects; a worked sketch of the complete model follows the parameter list below.

Equidistant projection:

x′ = c·arctan(√(x² + y²)/z) / √((y/x)² + 1) + x′0 + Δx′
y′ = c·arctan(√(x² + y²)/z) / √((x/y)² + 1) + y′0 + Δy′        (14)

Δx′ = x′·(A1·r′² + A2·r′⁴ + A3·r′⁶) + B1·(r′² + 2x′²) + 2·B2·x′·y′ + C1·x′ + C2·y′        (15)

Δy′ = y′·(A1·r′² + A2·r′⁴ + A3·r′⁶) + 2·B1·x′·y′ + B2·(r′² + 2y′²)        (16)

where:

A1, A2, and A3 are radial distortion parameters,

B1 and B2 are decentric distortion parameters,

C1 and C2 are horizontal scale factor and shear factor, respectively, and

c is the camera constant, which equals the focal length.
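Putting Eqs. (11)–(16) together, the following Python sketch (a minimal illustration with hypothetical parameter values, not the adjustment software used later in this chapter) projects an object point into fisheye image co‐ordinates under the equidistant model with additional parameters. Evaluating the corrections at the uncorrected projection is a simplification; a rigorous bundle adjustment handles them iteratively:

```python
import numpy as np

# All numeric values are hypothetical and serve only to make the sketch runnable.
c = 4.3                             # camera constant (focal length) [mm]
xp0, yp0 = 0.05, 0.10               # principal point x'0, y'0 [mm]
A1, A2, A3 = 8.0e-4, -3.9e-3, 0.0   # radial distortion parameters
B1, B2 = -1.1e-3, -1.4e-3           # decentring distortion parameters
C1, C2 = 5.1e-4, -3.8e-4            # horizontal scale and shear factors
R = np.eye(3)                       # rotation matrix, Eq. (11)
X0 = np.zeros(3)                    # translation (projection centre), Eq. (11)

def project(X_obj):
    # Eq. (11): object system -> camera system.
    x, y, z = np.linalg.inv(R) @ (np.asarray(X_obj, float) - X0)
    # Eq. (12): incidence angle theta, with tan(theta) = sqrt(x^2 + y^2) / z.
    theta = np.arctan2(np.hypot(x, y), z)
    # Eq. (14) without corrections: equidistant radius c*theta along (x, y).
    u = c * theta * x / np.hypot(x, y)
    v = c * theta * y / np.hypot(x, y)
    # Eqs. (15)-(16), evaluated at the uncorrected projection (u, v).
    r2 = u * u + v * v                        # r'^2, cf. Eq. (13)
    rad = A1 * r2 + A2 * r2**2 + A3 * r2**3   # A1*r'^2 + A2*r'^4 + A3*r'^6
    du = u * rad + B1 * (r2 + 2 * u * u) + 2 * B2 * u * v + C1 * u + C2 * v
    dv = v * rad + 2 * B1 * u * v + B2 * (r2 + 2 * v * v)
    return u + xp0 + du, v + yp0 + dv         # image co-ordinates x', y'

print(project([0.4, 0.2, 1.0]))
```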


4. Experiments

The characteristics of the cameras and fisheye lenses chosen for the application are given below.

The Nikon Coolpix8700 digital camera has a resolution of 8 megapixels and a CCD sensor. Its 8× optical Zoom‐Nikkor lens (f/2.8–4.2) offers a focal range of 8.9–71.2 mm [83]. The Nikon FC‐E9 fisheye converter reduces the focal length of the camera’s lens to 0.2× and provides an angle of view of approximately 183° (COOLPIX 5700) or 190° (COOLPIX 5400) [83].

The iPhone 4S camera has a resolution of 8 megapixels and a CMOS sensor; its focal length is equivalent to 35 mm in 35 mm film terms [84]. The Olloclip 3 in one is a device providing three different lens options for the iPhone: wide‐angle, fisheye and macro. With the fisheye lens, the Olloclip acquires a 180° field of view [85]. Table 1 shows the technical specifications of the cameras used in the application.

                       iPhone 4S                   Nikon Coolpix 8700
Sensor                 CMOS, 1/3.2″ sensor size    CCD, 2/3″
Image resolution       3264 × 2448 (8.0 MP)        3264 × 2448 (8.0 MP)
Focal length*          4.324602 mm                 9.027620 mm
Pixel size*            1.4 µm                      2.7 µm
Digital zoom values    Up to 5×                    Up to 4×
Aspect ratio           4:3                         4:3; 3:2
LCD size               3.5″                        1.8″

Table 1.

Technical specifications of the cameras.

Notes: *These values were obtained from separate calibrations performed before the application in the PI3000 software for the iPhone 4S and Nikon Coolpix8700 cameras used in the application. The pictures at a resolution of 3264 × 2448 were taken at this focal length.

The Olloclip 3 in one was mounted on the iPhone 4S. Images acquired with the iPhone 4S and Olloclip 3 in one fisheye lens combination were captured at a focal length of 4.28 mm; the images have 3264 × 2448 pixels with a 1.4 μm pixel width.

The calibration field used in this study is a satellite antenna with a diameter of 150 cm carrying 112 control points. It was chosen because it has a smooth digital surface model and is geometrically similar to the lens surface model. In this way, the errors caused by the geometry of the objective can be analysed, and a balanced distribution of the depth differences along the image acquisition line on the surface model is achieved. The point location accuracy is approximately 30–35 μm. To reach this point location accuracy, a geodetic Wild T3 theodolite was chosen for the direction measurements; in total, five measurement series were made horizontally and vertically with the Wild T3 [86]. The calibration field, a dish antenna model, complies with the self‐calibration model. Figure 7 shows images of the calibration field taken with the two camera‐lens combinations. The images of the calibration field were taken at the minimum focal length of each camera, without zooming.

Figure 7.

Images of the calibration field (left: iPhone 4S camera with Olloclip 3 in one; right: Nikon Coolpix8700 camera with Nikon FC‐E9 fisheye lens).

The application rests on the comparison of the iPhone 4S camera with Olloclip 3 in one fisheye lens combination against the Coolpix8700 camera with FC‐E9 fisheye lens combination in terms of the equidistant fisheye model [Eq. (5)], which gives the best result in the bundle block adjustment. In the application, the calibration values derived from the equidistant fisheye model for the two combinations are compared. Eqs. (15) and (16) use the A1, A2, A3, B1, B2, C1 and C2 coefficients as calibration parameters. Nine of the 112 control points are considered passing points, while the other 103 are full control points. Thirteen images of the calibration field were taken with the iPhone 4S and Olloclip 3 in one combination from different locations, taking the rules of free network adjustment into account; the same procedure was applied for the Coolpix8700 and FC‐E9 combination. Then, the two‐dimensional image co‐ordinates of the 112 control points in the 13 images were measured in the Pictran D software for the iPhone 4S and Olloclip 3 in one combination, and likewise for the Coolpix8700 and FC‐E9 combination. The measurement of the two‐dimensional image co‐ordinates in Pictran D is shown in Figure 8.

Figure 8.

Measurement of image co‐ordinates of the fisheye lenses in the Pictran D software (left: taken with the iPhone 4S camera and Olloclip 3 in one combination; right: taken with the Nikon Coolpix8700 camera and Nikon FC‐E9 fisheye lens combination).

The resulting image co‐ordinates were evaluated in bundle block adjustment software developed by Dr. Danilo Schneider at Dresden Technical University in Germany. The bundle block adjustment results are given in Tables 2 and 3.
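The number of unknowns reported in Table 2 can be made plausible with a simple accounting; the breakdown below is an assumption on our part rather than an official breakdown from the adjustment software:

```python
# One plausible accounting for the 396 unknowns reported in Table 2; this
# breakdown is an assumption, not stated explicitly in the chapter.
images, eo_per_image = 13, 6   # exterior orientation: X0, Y0, Z0, omega, phi, kappa
points = 103                   # object points carried as unknowns
io_params = 9                  # ck, x0, y0, A1, A2, B1, B2, C1, C2 (A3 omitted)

unknowns = images * eo_per_image + 3 * points + io_params
print(unknowns)                # 78 + 309 + 9 = 396
```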

                      iPhone 4S with Olloclip 3 in one    Nikon Coolpix8700 with Nikon FC‐E9
Sigma0                0.00099                             0.00163
Convergence           0                                   0
Max. iteration        100                                 100
Required iterations   18                                  32
Calculation time      3.57 s                              6.41 s
Unknowns              396                                 396
Observations          2868                                2838

Table 2.

Calibration results of the two different fisheye lens combinations in the bundle adjustment software.

        iPhone 4S with Olloclip 3 in one (mm)         Nikon Coolpix8700 with Nikon FC‐E9 (mm)
        Value          rms          Significance      Value          rms          Significance
ck      −2.23950000    0.00161000   8                 −1.69005000    0.00203000   8
x0      0.05886000     0.00397000   8                 0.03987000     0.00259000   8
y0      0.10471000     0.00399000   8                 0.11442000     0.00240000   8
A1      0.00080228     0.00027401   8                 0.00118750     0.00038919   6
A2      −0.00386240    0.00007329   8                 −0.00067814    0.00005062   8
B1      −0.00105490    0.00016903   8                 −0.00026227    0.00012548   4
B2      −0.00137450    0.00015347   8                 0.00021274     0.00011927   3
C1      0.00050562     0.00032783   1                 −0.00009505    0.00029182   1
C2      −0.00038251    0.00030980   6                 0.00031644     0.00028458   1

Table 3.

Calibration parameters calculated for the equidistant model.

For the 13 images captured with the iPhone 4S camera and Olloclip 3 in one, the bundle block adjustment yields a sigma0 value of 0.00099, with a pixel size of 0.0022 mm. For the 13 images captured with the Nikon Coolpix8700 camera and FC‐E9 fisheye lens, the sigma0 value is 0.00163, again with a pixel size of 0.0022 mm. Sigma0 is the root mean square (rms) of the image co‐ordinate measurements after bundle block adjustment. Table 2 shows the resulting values of both calibrations. Additionally, according to the software’s post‐adjustment outputs, the significance of the projection point parameters (X0, Y0, Z0 and omega, phi, kappa) of the 13 images and of the object points is 8, i.e. 99.9%, for both camera‐lens combinations. (The software’s significance levels are 1: no significance; 2: 80%; 3: 90%; 4: 95%; 5: 98%; 6: 99%; 7: 99.8%; 8: 99.9%, high significance.) [87]
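Since sigma0 and the pixel size are both given in millimetres, the image-space precision can be converted into pixel units; the following lines perform this simple arithmetic on the values reported above:

```python
pixel_mm = 0.0022                 # pixel size in image space [mm]
sigma0 = {
    "iPhone 4S + Olloclip 3 in one": 0.00099,
    "Coolpix8700 + FC-E9": 0.00163,
}
for combo, s_mm in sigma0.items():
    print(f"{combo}: sigma0 = {s_mm / pixel_mm:.2f} pixel")
# Roughly 0.45 pixel and 0.74 pixel, i.e. sub-pixel precision for both.
```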

Figure 9 shows the actual positions of the three‐dimensional co‐ordinates of the calibration field obtained from the adjustment results, together with the projection centres of each of the 13 images, taken from the two calibration files acquired after adjustment.

Figure 9.

Three‐dimensional positions of the projection centres with respect to the antenna, i.e. the three‐dimensional co‐ordinates of the camera resulting from the adjustment (left: iPhone 4S camera with Olloclip 3 in one; right: Nikon Coolpix8700 camera with Nikon FC‐E9 fisheye lens).


5. Results

At the end of the application designed for this test, the numerical values of the calibration parameters and their rms values, calculated according to the equidistant model, were compared between the ‘Olloclip 3 in one’ fisheye lens used with the iPhone 4S mobile phone and the standard ‘Nikon FC‐E9’ fisheye lens used on the Nikon Coolpix8700. This comparison is given in Table 3, and the resulting graphics are shown in Figures 10 and 11.

Figure 10.

Distortion parameters for two different camera‐fisheye lens combinations.

Figure 11.

Rms values of distortion parameters for two different camera‐fisheye lens combinations.

Since the A3 distortion parameter had a value small enough to be ignored, it was not analysed and is not listed in Table 3 [87]. Table 3 shows that the significance values of the iPhone are higher than those of the Nikon because of the smaller pixel structure of the iPhone’s camera (given in Table 1 in Section 4).

When Figure 11 is examined, it is seen that the distortion parameters of the iPhone 4S camera and Olloclip 3 in one fisheye lens equipment are larger, although it has a longer focal length; nevertheless, they do not differ significantly from those of the Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment. (ck: focal length after the calibration process; x0: image co‐ordinate of the principal point in the X direction; y0: image co‐ordinate of the principal point in the Y direction.)

Figures 10 and 11 illustrate that, when the x0 and y0 principal point co‐ordinates and the focal lengths of the two different fisheye lenses are set aside for these cameras of the same resolution and pixel size, a meaningful agreement is obtained. In view of these results, the current technology of the Olloclip 3 in one lens developed for mobile phones is remarkable. The Nikon FC‐E9 mounted on the bulky Nikon Coolpix8700 is difficult to use, whereas the Olloclip 3 in one mounted on the iPhone 4S can be considered for photogrammetric fisheye lens studies in place of the Nikon FC‐E9. The conclusion compares the advantages and disadvantages of the two different fisheye systems.

Nine of the 112 control points are considered passing points, while the other 103 are full control points. The 103 point co‐ordinates from the test area are considered errorless and are used in the bundle block adjustment. The three‐dimensional positions derived from the bundle block adjustment are compared with these errorless points. The Nikon Coolpix8700 camera and FC‐E9 fisheye lens combination achieves sub‐pixel accuracy on 85 points, and the iPhone 4S camera and Olloclip 3 in one fisheye lens combination on 89 points. This corresponds to 82.52% for the Nikon Coolpix8700 and FC‐E9 combination and 86.40% for the iPhone 4S and Olloclip 3 in one combination. The three‐dimensional object co‐ordinates of the 103 points are subtracted from the corresponding co‐ordinates derived in the bundle block adjustment, and points whose difference exceeds the sub‐pixel level on any axis are eliminated. Figures 12 and 13 are plotted from the obtained differences for the iPhone 4S camera with Olloclip 3 in one combination and for the Nikon Coolpix8700 camera with FC‐E9 fisheye lens combination, respectively. Delta X is the difference between the three‐dimensional point co‐ordinate measured before the adjustment and the three‐dimensional point co‐ordinate derived after the adjustment in the X direction; delta Y and delta Z are defined analogously.

Figure 12.

Subpixel graphic for the combination of iPhone 4S camera with Olloclip 3 in one fisheye lens.

Figure 13.

Subpixel graphic for the combination of Nikon Coolpix8700 camera with Nikon FC‐E9 fisheye lens.

The standard deviations of the co‐ordinate differences were calculated for the three axes from the data used to plot Figures 12 and 13. The standard deviation values on the X, Y and Z axes are 0.763, 0.558 and 0.638 mm, respectively, for the Olloclip 3 in one lens kit; correspondingly, they are 0.748, 0.699 and 0.517 mm for the Nikon FC‐E9 lens kit. As can be seen, the standard deviation values on identical axes are approximately equal. Moreover, when the distributions of the co‐ordinate differences on the three axes (X, Y, Z) of the two lens kits were evaluated, they showed the same maximum difference value of approximately 2 mm. From these graphics, the root mean square error of the point positions has been determined as 3.556 mm for the Olloclip 3 in one and 3.401 mm for the Nikon FC‐E9 lens kit. These values show that the internal reliability of the two fisheye lens kits is similar for three‐dimensional point co‐ordinate determination.
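The per-axis statistics above can be reproduced with a few lines of Python once the per-point co-ordinate differences are available; since the chapter does not tabulate the individual differences behind Figures 12 and 13, the arrays below are synthetic placeholders drawn with the quoted standard deviations:

```python
import numpy as np

# Synthetic stand-ins for the per-point differences delta X/Y/Z [mm];
# the actual values behind Figures 12 and 13 are not tabulated here.
rng = np.random.default_rng(42)
dX, dY, dZ = (rng.normal(0.0, s, 89) for s in (0.763, 0.558, 0.638))

# Per-axis standard deviations of the co-ordinate differences.
print(dX.std(ddof=1), dY.std(ddof=1), dZ.std(ddof=1))

# Root mean square of the three-dimensional point position differences.
rms_position = float(np.sqrt(np.mean(dX**2 + dY**2 + dZ**2)))
print(rms_position)
```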

When the differences between the image co‐ordinates measured before the adjustment and those derived after the adjustment are analysed in vector form, Figures 14 and 15 are obtained for the two fisheye lenses. As can be seen from these two figures, the residuals decrease towards the image principal point for both fisheye lens cameras, but increase towards the edges due to distortion. The standard deviations of the above‐mentioned experiment are 0.000814 for the iPhone 4S camera and Olloclip 3 in one fisheye lens combination and 0.000890 for the Nikon Coolpix8700 camera and FC‐E9 fisheye lens combination. According to these results, although the difference is not significant, the iPhone 4S camera and Olloclip 3 in one fisheye lens combination has lower distortion over the total image surface than the Nikon Coolpix8700 camera and FC‐E9 fisheye lens combination. Therefore, the iPhone 4S camera and Olloclip 3 in one fisheye lens combination could also be used for photogrammetric applications in place of the Nikon Coolpix8700 camera and FC‐E9 fisheye lens combination.

Figure 14.

Image co‐ordinate residuals for the iPhone 4S camera and Olloclip 3 in one fisheye lens combination (between the measured and adjusted co‐ordinates).

Figure 15.

Image co‐ordinate residuals for the Nikon Coolpix8700 camera and FC‐E9 fisheye lens combination (between the measured and adjusted co‐ordinates).

The advantages and disadvantages of using fisheye lenses with the above‐mentioned equipment can be listed as follows:

Advantages

  • The iPhone 4S camera and Olloclip 3 in one fisheye lens equipment is lightweight and much easier to use.

  • As given in Table 3, the focal length of the iPhone 4S camera and Olloclip 3 in one fisheye lens equipment is longer than that of the Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment, and the rms of the longer focal length is smaller.

  • There is no significant difference between the principal point co‐ordinates of the iPhone 4S camera with Olloclip 3 in one fisheye lens equipment and those of the Nikon Coolpix8700 camera with FC‐E9 fisheye lens.

Disadvantages

  • The Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment is heavier and much more difficult to use.

  • Since the focal length of the iPhone 4S camera and Olloclip 3 in one fisheye equipment is longer than that of the Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment, its distortion parameters would be expected to be smaller; however, when the distortion parameters of the two equipments are compared, the result consistently runs contrary to this expectation.

  • As given in Table 3, the mean square error values for the principal point of the iPhone 4S camera and Olloclip 3 in one fisheye lens equipment are higher than those of the Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment; therefore, the Nikon Coolpix8700 camera and FC‐E9 fisheye lens equipment can be considered more stable.

  • The iPhone 4S camera and Olloclip 3 in one fisheye lens equipment should be tested in a photogrammetric study, and the results should be interpreted in the light of those data.


6. Conclusion

The main purpose of this study was to test fisheye lens equipment used with mobile phones. In this study, the performance of the Olloclip 3 in one fisheye lens used with an iPhone 4S mobile phone and of the Nikon FC‐E9 fisheye lens used with a Nikon Coolpix8700 camera was analysed by comparing calibration results based on an equidistant model. The resolution of the cameras is the same for the two kinds of hardware, and the co‐ordinates of the principal point were found to be approximately equal in the calculations for both. The calibration results of the Olloclip 3 in one fisheye lens used with the iPhone 4S showed no statistically significant difference from those of the Nikon FC‐E9 fisheye lens used with the Nikon Coolpix8700. In addition, the results of this study showed that the Olloclip 3 in one fisheye lens has a longer focal length than the other. This experimental study shows that the Olloclip 3 in one fisheye lens developed for mobile phones has characteristics at least comparable to those of classic fisheye lenses.

Smartphones and fisheye lenses are very popular devices in developing computer technologies. The use of fisheye lenses, with their large distortions, was limited in the past, but today advanced computer software eases the solution of the distortion problem. As a result, fisheye lenses have become widely studied devices in photogrammetric fieldwork, and their use together with smartphones has opened new research areas. The dimensions of fisheye lenses used with smartphones are getting smaller and their prices are decreasing. Moreover, as verified in this study, the accuracy of fisheye lenses used on smartphones is better than that of conventional fisheye lenses. The use of smartphones with fisheye lenses will offer ordinary users the possibility of practical applications in the near future.


Author Biography

Cumhur Sahin was born in 1977. He received his bachelor’s degree in 2001 from Istanbul Technical University in Turkey, completed his Master of Science in Geodesy and Photogrammetry Engineering in 2004 at the Gebze Institute of Technology in Turkey, and obtained his doctorate in Geodesy and Photogrammetry Engineering in 2011 from Yildiz Technical University in Turkey. He worked as a research assistant at the Gebze Institute of Technology from 2001 to 2014 and has been a research assistant at Gebze Technical University in Turkey since 2014.

References

  1. iPhone Lens Kits Reviews [Internet]. 2015. Available from: http://iphone-lens-kits-review.toptenreviews.com/ [Accessed: April 21, 2017]
  2. Available from: http://www.mobilephonecameralenses.com/ [Accessed: April 21, 2017]
  3. Olloclip [Internet]. 2015. Available from: https://www.olloclip.com/ [Accessed: April 21, 2017]
  4. Chugh G, Bansal D, Sofat S. Road condition detection using smartphone sensors: A survey. International Journal of Electronic and Electrical Engineering. 2014;7(6):595–602. ISSN: 0974‐2174
  5. Perttunen M, Mazhelis O, Cong F, Kauppila M, Leppanen T, Kantola J, Collin J, Pirttikangas S, Haverinen J, Ristaniemi T, Riekki J. Distributed road surface condition monitoring using mobile phones. In: Proceedings of the 8th International Conference UIC 2011; September 2–4, 2011; Banff, Canada. Berlin, Heidelberg: Springer; 2011. ISBN: 978-3-642-23640-2. DOI: 10.1007/978-3-642-23641-9_8
  6. Mohan P, Padmanabhan VN, Ramjee R. Nericell: Rich monitoring of road and traffic conditions using mobile smartphones. In: Proceedings of the 6th ACM Conference on Embedded Network Sensor Systems (SenSys '08); November 5–7, 2008; Raleigh, North Carolina, USA
  7. Wagner D, Reitmayr G, Mulloni A, Drummond T, Schmalstieg D. Pose tracking from natural features on mobile phones. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality; September 15–18, 2008; Cambridge, UK
  8. Prisacariu VA, Kahler O, Murray D, Reid I. Simultaneous 3D tracking and reconstruction on a mobile phone. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR); October 1–4, 2013; Adelaide, SA, Australia. pp. 89–98
  9. Tanskanen P, Kolev K, Meier L, Camposeco F, Saurer O, Pollefeys M. Live metric 3D reconstruction on mobile phones. In: Proceedings of the IEEE International Conference on Computer Vision (ICCV); December 1–8, 2013; Sydney, NSW, Australia. pp. 55–62
  10. Pan Q, Arth C, Reitmayr G, Rosten E, Drummond T. Rapid scene reconstruction on mobile phones from panoramic images. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality; October 26–29, 2011; Basel, Switzerland. pp. 55–64
  11. Wagner D, Mulloni A, Langlotz T, Schmalstieg D. Real‐time panoramic mapping and tracking on mobile phones. In: Proceedings of the IEEE International Symposium on Virtual Reality; March 20–24, 2010; Waltham, Massachusetts, USA. pp. 211–218
  12. Almazan J, Bergasa LM, Yebes JJ, Barea R, Arroyo R. Full auto‐calibration of a smartphone on board a vehicle using IMU and GPS embedded sensors. In: Proceedings of the IEEE Intelligent Vehicles Symposium (IV); June 23–26, 2013; Gold Coast, Australia. pp. 1374–1380
  13. Hill A, MacIntyre B, Gandy M, Davidson B, Rouzati H. KHARMA: An open KML/HTML architecture for mobile augmented reality applications. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality; October 13–16, 2010; Seoul, Korea. pp. 233–234
  14. Kurz D, Benhimane S. Gravity‐aware handheld augmented reality. In: Proceedings of the IEEE International Symposium on Mixed and Augmented Reality; October 26–29, 2011; Basel, Switzerland. pp. 111–120
  15. Porzi L, Ricci E, Ciarfuglia TA, Zanin M. Visual‐inertial tracking on Android for augmented reality applications. In: Proceedings of the IEEE International Symposium on Environmental Energy and Structural Monitoring Systems (EESMS); September 28, 2012; Perugia, Italy
  16. Schindhelm CK, Gschwandtner F, Banholzer M. Usability of Apple iPhones for inertial navigation systems. In: Proceedings of the 22nd IEEE International Symposium on Personal, Indoor and Mobile Radio Communications; September 11–14, 2011; Toronto, ON, Canada. pp. 1254–1258
  17. Burgess S, Åström K, Lindquist B, Ljungberg R, Sharma K. Indoor localization using smartphones in multi‐floor environments without prior calibration or added infrastructure. In: Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN); October 13–16, 2015; Banff, Alberta, Canada
  18. Burgess S, Åström K, Högström M, Lindquist B, Ljungberg R. Smartphone positioning in multi-floor environments without calibration or added infrastructure. In: Proceedings of the IEEE International Conference on Indoor Positioning and Indoor Navigation (IPIN); October 4–6, 2016; Madrid, Spain
  19. Hadid A, Heikkila JY, Silven O, Pietikainen M. Face and eye detection for person authentication in mobile phones. In: Proceedings of the International Conference on Distributed Smart Cameras (ICDSC '07); September 25–28, 2007; Vienna, Austria
  20. Shen Y, Hu W, Yang M, Wei B, Luce S, Chou CT. Face recognition on smartphones via optimised sparse representation classification. In: Proceedings of the IEEE International Conference on Information Processing in Sensor Networks (IPSN '14); April 15–17, 2014; Berlin, Germany. pp. 237–248
  21. Wang Y, Tan R, Xing G, Wang J, Tan X, Liu X. Energy‐efficient aquatic environment monitoring using smartphone‐based robots. ACM Transactions on Sensor Networks. 2016;12(3):25. DOI: 10.1145/2932190
  22. Sreelekshmi B, Vidya N. Survey on monitoring aquatic debris using smartphone‐based robots. International Journal of Engineering and Computer Science. 2015;58(10):20131–20133. DOI: 10.18535/ijecs/v6i1.50
  23. Maindalkar AA, Ansari SM. Aquatic robot design for water pollutants monitoring. International Journal on Recent and Innovation Trends in Computing and Communication. 2015;3(6):3699–3703. ISSN: 2321‐8169
  24. Muaremi A, Arnrich B, Troster G. Towards measuring stress with smartphones and wearable devices during workday and sleep. BioNanoScience. 2013;3:172–183. DOI: 10.1007/s12668‐013‐0089‐2
  25. Miluzzo E, Wang T, Campbell AT. EyePhone: Activating mobile phones with your eyes. In: Proceedings of the 2nd ACM SIGCOMM Workshop on Networking, Systems, and Applications for Mobile Handhelds (MobiHeld); August 30, 2010; New Delhi, India. DOI: 10.1145/1851322.1851328
  26. Sandstrom GM, Lathia N, Mascolo C, Rentfrow PJ. Putting mood in context: Using smartphones to examine how people feel in different locations. Journal of Research in Personality. 2016. Open access, available online June 8, 2016. http://dx.doi.org/10.1016/j.jrp.2016.06.004
  27. Lane ND, Xu Y, Lu H, Hu S, Choudhury T, Campbell AT, Zhao F. Enabling large-scale human activity inference on smartphones using community similarity networks (CSN). In: Proceedings of the 13th ACM International Conference on Ubiquitous Computing (UbiComp '11); September 17–21, 2011; Beijing, China. pp. 355–364. DOI: 10.1145/2030112.2030160
  28. Lu H, Yang J, Liu Z, Lane ND, Choudhury T, Campbell AT. The Jigsaw continuous sensing engine for mobile phone applications. In: Proceedings of the 8th ACM International Conference on Embedded Networked Sensor Systems (SenSys '10); November 3–5, 2010; Zurich, Switzerland. DOI: 10.1145/1869983.1869992
  29. Lu H, Rabbi M, Chittaranjan GT, Frauendorfer D, Mast MS, Campbell AT, Perez DG, Choudhury T. StressSense: Detecting stress in unconstrained acoustic environments using smartphones. In: Proceedings of the ACM International Conference on Ubiquitous Computing (UbiComp '12); September 5–8, 2012; Pittsburgh, USA
  30. Lane ND, Miluzzo E, Lu H, Peebles D, Choudhury T, Campbell AT. A survey of mobile phone sensing. IEEE Communications Magazine. 2010;48(9):140–150. DOI: 10.1109/MCOM.2010.5560598
  31. Rachuri KR, Musolesi M, Mascolo C, Rentfrow P, Longworth C, Aucinas A. EmotionSense: A mobile phones based adaptive platform for experimental social psychology research. In: Proceedings of the ACM International Conference on Ubiquitous Computing (UbiComp '10); September 26–29, 2010; Copenhagen, Denmark. DOI: 10.1145/1864349.1864393
  32. LiKamWa R, Liu Y, Lane ND, Zhong L. Can your smartphone infer your mood? In: Proceedings of the Second International Workshop on Sensing Applications on Mobile Phones at ACM SenSys '11; November 1, 2011; Seattle, USA
  33. Ha H, Bok Y, Joo K, Jung J, Kweon IS. Accurate camera calibration robust to defocus using a smartphone. In: Proceedings of the IEEE International Conference on Computer Vision (ICCV); 2015. pp. 828–836. DOI: 10.1109/ICCV.2015.101
  34. Gruen A, Akca D. Calibration and accuracy testing of mobile phone cameras. In: Proceedings of the 28th Asian Conference on Remote Sensing (ACRS '07); November 12–16, 2007; Kuala Lumpur, Malaysia. pp. 1171–1177
  35. Delaunoy A, Li J, Jacquet B, Pollefeys M. Two cameras and a screen: How to calibrate mobile devices? In: Proceedings of the 2nd IEEE International Conference on 3D Vision; December 8–11, 2014; Tokyo, Japan. pp. 123–130
  36. Saponaro S, Kambhamettu C. Towards auto‐calibration of smart phones using orientation sensors. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops; June 23–28, 2013; Portland, Oregon, USA. pp. 20–26
  37. Ahn H, Choi C, Yeu Y. Assessment of smartphone camera in close‐range digital photogrammetry. In: Proceedings of the 34th Asian Conference on Remote Sensing (ACRS '13); October 20–24, 2013; Bali, Indonesia. pp. 1630–1634
  38. Abraham S, Forstner W. Fish‐eye‐stereo calibration and epipolar rectification. ISPRS Journal of Photogrammetry and Remote Sensing. 2005;59(2):278–288
  39. Gledhill D, Tian GT, Taylor D, Clarke D. Panoramic imaging—a review. Computers & Graphics. 2003;27(3):435–445
  40. Schwalbe E. Geometric modelling and calibration of fisheye lens camera systems. In: Proceedings of the 2nd ISPRS Panoramic Photogrammetry Workshop; February 24–25, 2005; Berlin, Germany. pp. 5–8
  41. Kumler J, Bauer M. Fisheye lens designs and their relative performance. In: Proceedings of Current Developments in Lens Design and Optical Systems Engineering, Vol. 4093 (SPIE '00); October 24, 2000; San Diego, California, USA. pp. 360–369
  42. Dhane P, Kutty K, Bangadkar S. A generic non‐linear method for fisheye correction. International Journal of Computer Applications. 2012;51(10):58–65
  43. Yamashita M, Yoshimura M, Nakashizuka T. Cloud cover estimation using multitemporal hemisphere imageries. In: Proceedings of the 20th ISPRS Congress, Photogrammetry and Remote Sensing (ISPRS '04); July 12–23, 2004; Istanbul, Turkey. pp. 826–829
  44. Wang L, Duan F, Lv K. Fisheye-lens-based visual sun compass for perception of spatial orientation. Mathematical Problems in Engineering. 2012;2012:460430, 15 pages. http://dx.doi.org/10.1155/2012/460430
  45. Oh JKW, Haberl JS. New educational software for teaching the sunpath diagram and shading mask protractor. In: Proceedings of the 5th Conference of the International Building Performance Simulation Association (IBPSA); September 8–10, 1997; Prague, Czech Republic. 7 pages
  46. Beekmans C, Schneider J, Läbe T, Lennefer M, Stachniss C, Simmer C. Cloud photogrammetry with dense stereo for fisheye cameras. Atmospheric Chemistry and Physics. 2016;16:14231–14248. DOI: 10.5194/acp‐16‐14231‐2016
  47. Krishnan G, Nayar SK. Cata‐fisheye camera for panoramic imaging. In: Proceedings of the IEEE Workshop on Applications of Computer Vision; January 7–8, 2008; Copper Mountain, Colorado, USA. pp. 1–8
  48. Streckel B, Evers‐Senne JF, Koch R. Lens model selection for a markerless AR tracking system. In: Proceedings of the 4th IEEE and ACM International Symposium on Mixed and Augmented Reality; October 22–25, 2005; Vienna, Austria. pp. 130–133
  49. Brun X, Deschaud JE, Goulette F. On-the-way city mobile mapping using laser range scanner and fisheye camera. In: Proceedings of the 5th ISPRS International Symposium on Mobile Mapping Technology (MMT '07); May 29–31, 2007; Padua, Italy. 8 pages
  50. Yamamoto D, Ozeki S, Takahashi N. Wired fisheye lens: A motion-based improved fisheye interface for mobile web map services. In: Proceedings of the 9th International Symposium on Web and Wireless Geographical Information Systems (W2GIS 2009); December 7–8, 2009; Maynooth, Ireland. Springer. ISSN: 0302-9743
  51. Ahmad A, Lima P. Multi‐robot cooperative spherical object tracking in 3D space based on particle filters. Robotics and Autonomous Systems. 2013;61(10):1084–1093
  52. Zheng JY, Li S. Employing a fish‐eye for scene tunnel scanning. In: Proceedings of the 7th Asian Conference on Computer Vision (ACCV '06); January 13–16, 2006; Hyderabad, India. pp. 509–518
  53. Martinez E, Del Pobil AP. A dioptric stereo system for robust real‐time people tracking. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '09); May 12–17, 2009; Kobe, Japan
  54. Gurtner A. Investigation of fisheye lenses for small UAV aerial photography [thesis]. Brisbane, Australia: Queensland University of Technology; 2008
  55. Grelsson B. Global pose estimation from aerial images: Registration with elevation models [thesis]. Linköping, Sweden: Linköping University Electronic Press, Linköpings Universitet; 2014
  56. Naruse T, Kaneko T, Yamashita A, Asama H. 3‐D measurement of objects in water using fish‐eye stereo camera. In: Proceedings of the 19th IEEE International Conference on Image Processing (ICIP '12); September 30–October 3, 2012; Orlando, Florida, USA. pp. 2773–2776
  57. Bouaziz S. Geolocalization using skylines from omni‐images [thesis]. Lausanne, Switzerland: Virtual Reality Laboratory, École Polytechnique Fédérale de Lausanne; 2009
  58. Georgantas A, Brédif M, Desseilligny MP. An accuracy assessment of automated photogrammetric techniques for 3D modelling of complex interiors. In: Proceedings of the 22nd ISPRS Congress, Vol. 39‐B3; August 25–September 1, 2012; Melbourne, Australia. pp. 23–28
  59. Stehle T, Truhn D, Aach T, Trautwein C, Tischendorf J. Camera calibration for fish‐eye lenses in endoscopy with an application to 3D reconstruction. In: Proceedings of the IEEE International Symposium on Biomedical Imaging (ISBI); April 12–15, 2007; Washington, D.C., USA. pp. 1176–1179
  60. Hu P, Shen G, Li L, Lu D. ViRi: View it right. In: Proceedings of the 11th ACM Annual International Conference on Mobile Systems, Applications, and Services (MobiSys '13); June 25–28, 2013; Taipei, Taiwan
  61. Schneider D, Schwalbe E. Integrated processing of terrestrial laser scanner data and fisheye-camera image data. In: Proceedings of the 21st ISPRS Congress, Vol. XXXVII, Part B5; July 3–11, 2008; Beijing, China. pp. 1037–1044
  62. Hughes C, Denny P, Jones E, Glavin M. Accuracy of fisheye lens models. Applied Optics. 2010;49(17):3338–3347
  63. Junior JM, Moraes MVA, Tommaselli AMG. Experimental assessment of techniques for fisheye camera calibration. Boletim de Ciências Geodésicas. 2015;21(3):637–651
  64. Arfaoui A, Thibault S. Fisheye lens calibration using virtual grid. Applied Optics. 2013;52(12):2577–2583
  65. Kim D, Paik J. Three‐dimensional simulation method of fish‐eye lens distortion for a vehicle backup rear‐view camera. Journal of the Optical Society of America A. 2015;32(7):1337–1343
  66. Torii A, Havlena M, Pajdla T. Omnidirectional image stabilization for visual object recognition. International Journal of Computer Vision. 2011;91:157–174. DOI: 10.1007/s11263‐010‐0350‐x
  67. Schneider D, Schwalbe E, Maas H‐G. Validation of geometric models for fisheye lenses. ISPRS Journal of Photogrammetry and Remote Sensing. 2009;64(3):259–266
  68. Kannala J, Brandt SS. A generic camera model and calibration method for conventional, wide‐angle, and fish‐eye lenses. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2006;28(8):1335–1340
  69. Parian JA. Panoramic imaging: Techniques, sensor modeling and applications. In: Proceedings of the International Summer School on Digital Recording and 3D Modelling; April 2006; Crete, Greece
  70. Zhu H, Wang X, Zhou J, Wang X. Approximate model of fisheye camera based on the optical refraction. Multimedia Tools and Applications. 2014;73:1445–1457. DOI: 10.1007/s11042‐013‐1641‐3
  71. Micusik B, Pajdla T. Estimation of omnidirectional camera model from epipolar geometry. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; June 16–22, 2003; Madison, Wisconsin, USA. pp. 485–490
  72. Thirthala S, Pollefeys M. The radial trifocal tensor: A tool for calibrating the radial distortion of wide‐angle cameras. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), Vol. 1; June 20–25, 2005; San Diego, California, USA. pp. 321–328
  73. Hughes C, Glavin M, Jones E. Simple fish‐eye calibration method with accuracy evaluation. Electronic Letters on Computer Vision and Image Analysis. 2011;10(1):54–62
  74. Friel M, Hughes C, Denny P, Jones E, Glavin M. Automatic calibration of fish‐eye cameras from automotive video sequences. Intelligent Transport Systems. 2009;4(2):136–148. DOI: 10.1049/iet‐its.2009.0052
  75. Schmid HH. General Analytical Solution to the Problem of Photogrammetry. Aberdeen: US Ballistic Research Laboratories; 1959. Accession Number: AD0230349
  76. Mikhail EM, Bethel JS, McGlone JC. Introduction to Modern Photogrammetry. New York: John Wiley & Sons; 2001
  77. Hou W, Ding M, Qin N, Lai X. Digital deformation model for fisheye image rectification. Optics Express. 2012;20(20):22252–22261
  78. Johnson KB. Development of a versatile wide‐angle lens characterization strategy for use in the OMNIster stereo vision system [thesis]. Knoxville, Tennessee, USA: University of Tennessee; 1997
  79. Miyamoto K. Fish eye lens. Journal of the Optical Society of America. 1964;54(8):1060–1061
  80. Hughes C, Denny P, Glavin M, Jones E. Equidistant fisheye calibration and rectification by vanishing point extraction. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2010;32(12):2289–2296
  81. Hughes C, McFeely R, Denny P, Glavin M, Jones E. Equidistant (fθ) fish‐eye perspective with application in distortion centre estimation. Image and Vision Computing. 2010;28(3):538–551
  82. Parhizgar S. An evaluation of an omni‐directional (fisheye) vision sensor for estimating the pose of an object [thesis]. Regina, Saskatchewan, Canada: University of Regina; 2014
  83. Nikon [Internet]. 2015. Available from: http://imaging.nikon.com/ [Accessed: April 21, 2017]
  84. Image Resource [Internet]. 2015. Available from: http://www.imaging-resource.com/ [Accessed: April 21, 2017]
  85. Photo.net [Internet]. 2015. Available from: http://photo.net/ [Accessed: April 21, 2017]
  86. Sahin C. A construction of camera calibration field for close range photogrammetric camera calibration [thesis]. Kocaeli, Turkey: Gebze Institute of Technology, Graduate School of Science Engineering and Sciences; 2004
  87. Sahin C. Comparison and calibration of mobile phone fisheye lens and regular fisheye lens via equidistant model. Journal of Sensors. 2016;2016:9379203, 11 pages. http://dx.doi.org/10.1155/2016/9379203
