Fully automating fine-optics manufacture - why so tough, and what are we doing?

Abstract

Precision and ultra-precision surfaces are crucial for many products – quality optics, joint & cranial implants, turbine blades, and industrial moulds & dies, to name a few. Automation in this context is distinct from standard procedures in industry, where an identical sequence of operations can be repeated over and over again. Ultraprecision tolerances may be tens to hundreds of times tighter, and this is compounded by the hundreds of diverse substrate materials in use. Even with modern computer numerically controlled (CNC) machines, skilled craftspeople are needed to plan a process-chain for a new material or geometry. Processes working at these tight tolerances fall short of being fully deterministic, so repeated process-metrology iterations are required. Surface-correction loops may be automated, but expert assessment should be performed at each step to check for unexpected anomalies. The ultimate goal of importing a part, processing autonomously, and delivering a finished part to an “optical” specification with no human intervention is still a long way off. This paper describes the challenge and why it is important. It then melds together process-monitoring, psychology, artificial intelligence and robotics, to take a far-sighted view of how the ultimate goal can be realised.

Introduction

Components with tolerances in the micron-regime are routinely mass-produced, from raw metal to finished part, on fully-automated production lines. The auto-sector is an obvious example. This paper considers why this is not the case for ultra-precision surfaces, including and especially optics, and what new methodologies could be applied to remedy this.

The global market for photonics products in 2015 was €447B, with growth of 6% [1], and the medical imaging market alone had reached US$989M by 2015, with growth of 12% [2]. New large-scale applications of freeform optics are on the horizon, such as advanced lighting and imaging systems for autonomous vehicles. Meanwhile, optical components are not the only application for complex, precise surfaces. For example, the knee and hip joint replacement market is likely to face a revolution, with bespoke additively-manufactured implants which need precision finishing. Many high-precision moulds and dies are still finished by hand-fettling. Overall, we see significant market growth, confronted by increasing difficulty in recruiting highly-skilled individuals with the relevant ‘hands-on’ experience, as a generation of such experts retires. With that in mind, we consider how ultraprecision components are produced, and how this could potentially be automated.

The ultra-precision manufacturing process-chain generally starts with a blank of material from a supplier, either pre-ground to near-net form, or as a plane-parallel slab, which may have been cast, sawn or milled, depending on material. Ductile materials, such as soft metals and plastics, can be single-point diamond-turned directly, providing nanometre (nm) surface textures, but with repetitive turning marks and form-errors superimposed. Non-rotationally-symmetric forms can be created by servoing tool-infeed, synchronised with part-rotation.

For brittle materials such as glasses, crystals, ceramics and hard metals (the main focus of this paper), the traditional route is to remove bulk material to produce the overall geometry and basic forms of the functional surfaces, using a classical curve-generator or, more usually today, a CNC grinding machine. Depending on the grinding-quality, the functional surfaces may require refining (‘lapping’ or ‘smoothing’) before polishing, using a progression of finer abrasives. From our experience, a state-of-the-art CNC grinding machine working hard, brittle materials may deliver ~ 10 μm of sub-surface damage (SSD) with a few microns RMS form error, depending on material, size and geometry of the part. One such machine is the Cranfield BoX™ machine [3]. Subsequent processing using CNC free-abrasive polishing [4] and similar techniques such as ‘grolishing’ [5] aims to remove the SSD layer, correct form errors, and refine texture to 1-2 nm Sa for most optics, but down to 0.1–0.2 nm Sa in critical applications. Other requirements may make optical fabrication particularly demanding, such as precise control of edges [6, 7].

In this paper, we first consider why autonomous manufacturing of ultraprecision surfaces is currently not possible, and then focus on the role of expert-machine operators. We report on an experiment to capture such expertise through video and audio recordings, and present part of a comprehensive flow chart so generated. We then consider the role of digital process-monitoring, and report on development of a bespoke force-measuring fixture. Together, such methods provide feedstock information for artificial intelligence techniques, which we describe. We summarise our practical work on physical implementation of a robotic cell. Finally, we draw the threads together in a diagrammatic representation of the work-packages needed to deliver an autonomous cell, and the status of current work.

Methods

Aim – the challenge of autonomous manufacturing of high-precision surfaces

Our ultimate goal is ‘bespoke mass production’ – mass-produced parts that may all be different, at no extra cost compared with making them all the same. Such a Manufacturing Cell would accept specifications for diverse parts, and be supplied with blanks from the material suppliers. It would then output ultra-precision parts, without manual intervention. In working towards this goal, we are confronted by currently-insurmountable obstacles, as outlined below:-

  1.

    How to start. We consider the diversity of materials commonly in use, embracing a very wide range of mechanical, thermal and chemical properties; ductile to brittle, inert to chemically-active, and zero-expansion to high expansion-coefficient. For example, Schott produces over 120 optical glass types; there are other specialist glass manufacturers, numerous crystalline materials particularly for infrared wavelengths, ceramics for engineering and mirror-substrates, hard alloys such as cobalt chrome, and soft materials such as aluminium. Even with state-of-the-art machines, expertise of skilled operators is required to configure the optimum process-chain (CNC grinding or diamond-turning parameters, then abrasives, pads, pressures, speeds and feeds etc. for CNC polishing and allied processes and, of course, metrology).

  2.

    Fundamental complexity of polishing at the molecular scale, with imperfectly-understood interplay of brittle fracture, chemical attack, and plastic flow (depending on material). The baseline widely adopted is Preston’s empirical equation [8]:-

$$ \frac{dz}{dt} = k_p P V $$
(1)

where z is the layer-thickness removed in a polishing time t, kp is the empirical Preston coefficient, P the applied pressure, and V the relative speed between tool and part.
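As a minimal numerical illustration of Eq. 1 (the coefficient, pressure, speed and time below are assumed values for illustration only, not data from this work):

```python
# Minimal sketch of Preston's equation (Eq. 1): dz/dt = k_p * P * V.
# All numbers are illustrative assumptions, not measured values.

def preston_depth(k_p: float, pressure_pa: float, velocity_ms: float, time_s: float) -> float:
    """Layer thickness removed (m) for constant pressure (Pa) and relative speed (m/s)."""
    return k_p * pressure_pa * velocity_ms * time_s

k_p = 1.0e-13   # assumed Preston coefficient, m^2/N
dz = preston_depth(k_p, pressure_pa=1.0e4, velocity_ms=0.5, time_s=600.0)
print(f"Removed thickness: {dz * 1e9:.0f} nm")   # 1e-13 * 1e4 * 0.5 * 600 = 3e-7 m = 300 nm
```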

There is a host of literature related to the fundamental mechanisms of polishing, much in the context of semiconductor wafers and dielectric films being polished face-down on a large polishing lap, and most related to flat surfaces. The nominal contact-area is then that of the part. In corrective polishing of aspheric or free-form parts, a tool smaller than the part usually faces down onto the face-up part, and the relevant area is that of the tool.

Maury et al. [9] note that Preston’s Equation is not strictly observed in practice, and the authors identify two regimes, depending on the product of tool pressure P and velocity V. They conclude that, for large values of PV, the relationship is linear, but with a non-zero intercept; for small PV values the slope is higher, but with zero intercept. They attribute this behaviour to variations in polishing efficiency, which is reduced for high PV values due to the centrifugal action and high down-forces tending to squeeze slurry out of the working interface. Hocheng et al. [10] present a detailed physical and chemical model based on individual abrasive material-removal in a flow-field. When the part’s surface is hydrated during polishing, a slurry particle can bind to the resulting Si(OH)4 molecule; for that molecule to be removed, the shear force must exceed its binding force. Their model considers that abrasives then remove material by a bear-and-shear process, while slurry-flow through the interface between part and tool plays a role in transporting the chemical component of removal. They conclude that “Preston’s equation can be properly interpreted and used with modification”, as follows:-

$$ M \propto f\left( \frac{n\, D_p^2\, D_m^2}{12\pi \gamma} \right) \left(\mu A\right)^{1/2} \left(PV\right)^{1/2} $$
(2)

where M is the volumetric removal rate, f the abrasive encounter frequency, n is a “number greater than 100” (presumably reflecting the statistical nature of polishing), Dp is the diameter of a slurry particle, Dm the diameter of an Si(OH)4 molecule, γ is the surface energy between the surface and sublayer molecule, μ the viscosity of the slurry film, A the area in contact, and P and V the polishing pressure and relative velocity respectively. Maury et al. [9] derive a V^{1/2} theoretical relationship, distinct from Preston’s V^{1.0}, and then compare their theoretical result with their empirical data, which indicate V^{0.65}. They attribute this discrepancy to a portion of the contact area between part and tool operating in the direct-contact mode, as a result of a “not ideally conformed pad”. This is important, as pads wear in practice, and from [9] this would be expected to cause a drift in the exponent of V in Eq. 2 within the range 0.5–1.0 during polishing. Similarly, the term f in Eq. 2 is related to slurry concentration, which directly impacts removal rate. This can be affected by settlement on a face-up part or by water-evaporation (both enhancing removal), or by settlement within the machine (reducing removal). We have observed these phenomena in CNC polishing.

Note that Preston’s Equation is framed in terms of the layer-thickness removed and the pressure exerted. Substituting P = F/A and multiplying by the contact-area A gives the volumetric removal rate M in terms of the polishing-force F:-

$$ M = k_p F V $$
(3)

Interestingly, M is independent of the diameter of the contact zone (e.g. tool-size) for constant force (or constant weight with a gravity-loaded tool). In contrast, Eq. 2 demonstrates a square-root dependence on A for constant force, or a linear dependence on diameter. Maury’s comment regarding shear force can also be linked to the role of lateral friction. Pal et al. [11] reinforce this, stating that friction at the workpiece surface allows the abrasive particles to remove the hydrated layer. They then describe the observed variability of the coefficient of friction, stabilising during a period after polishing starts, with the stabilised value depending on load and relative velocity. They identify three removal regimes, depending on the relationship between part, pad and slurry:- Contact Mode, Hydroplaning Mode, and Mixed Mode.
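The practical consequence of Eq. 3 can be illustrated with a short sketch (all numbers are illustrative assumptions): at constant force the volumetric removal rate is the same for a small or a large tool, even though the depth removed per unit time under the tool (Eq. 1) differs:

```python
# Sketch contrasting Eq. 1 (depth rate, pressure-based) with Eq. 3 (volumetric rate, force-based).
# Numbers are illustrative assumptions.
import math

k_p = 1.0e-13   # assumed Preston coefficient, m^2/N
F = 10.0        # constant polishing force, N (e.g. a gravity-loaded tool)
V = 0.5         # relative speed between tool and part, m/s

for tool_diameter_m in (0.01, 0.04):                 # 10 mm and 40 mm contact-zone diameters
    A = math.pi * (tool_diameter_m / 2.0) ** 2       # nominal contact area, m^2
    P = F / A                                        # pressure under the tool, Pa
    depth_rate = k_p * P * V                         # Eq. 1: local depth removal rate, m/s
    volume_rate = k_p * F * V                        # Eq. 3: volumetric rate, m^3/s (independent of A)
    print(f"d = {tool_diameter_m * 1e3:.0f} mm: depth rate {depth_rate * 1e9:.1f} nm/s, "
          f"volumetric rate {volume_rate * 1e18:.0f} um^3/s")
```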

From the above papers, and numerous references cited therein, we see a highly complex interplay of chemical and mechanical removal mechanisms, with transitioning between removal-regimes as conditions change. The hydrodynamics at the polishing interface is clearly complex even for flat parts, given the real case of imperfectly-flat (at the molecular level) tools and parts. With aspheric and freeform surfaces, this is greatly complicated, with tool-misfit by hard tools potentially introducing variability in the contact-modes identified in [10, 11]. This can be mitigated using compliant tools, or, in the Precessions™ process, using an inflated spherical bonnet [4]. The volumetric removal rate in Eq. 2 also depends on the diameter of the slurry particles, which is itself a distribution almost always unknown in detail, and which varies with time during polishing as particles break down or, if polishing is interrupted, agglomerate.

Given the above complexity, the typical highly-skilled polishing operator makes process-decisions, based not on an in-depth understanding of the fundamental physics and chemistry, but from years of hard-won practical experience.

  3.

    Imperfect determinism of polishing. Given the above, computer-controlled corrective polishing usually aims to hold the basic polishing parameters constant throughout a polishing run. Usually, a tool much smaller than the part will be used, following a pre-determined tool-path. Dwell-time modulation is the preferred way to correct form-errors, usually executed by changing the traverse-speed along the tool-path. However, this presupposes a linear relationship, whereas the change in velocity-vector (of tool-rotation plus traverse-speed) may well disturb the hydrodynamics of the removal mechanism. In practice, when a corrective algorithm commands a removal-distribution over the surface, the result will always fall short of perfect conformance. 80% convergence (20% error) is considered excellent; often it can be inferior. Surfaces sometimes regress at a particular step, or unexpected artefacts appear, for no known reason, requiring expert diagnosis and remedy. This is clearly related to the fundamental complexity above, to variability in parameters not controlled, and to potential transitioning between removal modes. The impact is the iterative nature of correcting measured surface form-errors, i.e. repeated cycles of process and metrology. Whilst the corrective algorithm itself may be executed automatically to generate a dwell-time map, the wise operator will inspect the metrology data and the part at each step, looking for any anomalies, and may decide to take remedial action by adjusting the process parameters in some way.
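To make the dwell-time idea concrete, the following is a heavily simplified one-dimensional sketch (the Gaussian tool influence function, its peak removal rate and the error profile are all assumptions, and real correctors solve a proper deconvolution with edge and velocity constraints). Even in this idealised case, convergence falls short of 100% because the finite tool footprint smooths the commanded removal:

```python
# Heavily simplified 1-D sketch of dwell-time modulation for corrective polishing.
# Assumed: a Gaussian tool influence function (TIF) and a zeroth-order dwell estimate;
# real corrective algorithms solve a deconvolution problem with additional constraints.
import numpy as np

dx = 0.5                                                  # grid spacing along the tool-path, mm
x = np.arange(0.0, 100.0 + dx, dx)                        # surface coordinate, mm
error = 50.0 * np.exp(-((x - 50.0) / 20.0) ** 2)          # measured form error, nm (assumed)

tif_sigma = 5.0                                           # assumed TIF width, mm
peak_rate = 2.0                                           # assumed peak removal rate, nm/s
s = np.arange(-3 * tif_sigma, 3 * tif_sigma + dx, dx)
tif = peak_rate * np.exp(-(s / tif_sigma) ** 2)           # removal rate across the footprint, nm/s

line_tif = np.sum(tif) * dx                               # nm removed per unit dwell-density (s/mm)
dwell_density = error / line_tif                          # zeroth-order dwell estimate, s/mm
removal = np.convolve(dwell_density, tif, mode="same") * dx   # predicted removal, nm
residual = error - removal

convergence = 1.0 - np.max(np.abs(residual)) / np.max(error)
print(f"Peak error {np.max(error):.0f} nm -> peak residual {np.max(np.abs(residual)):.1f} nm "
      f"({convergence:.0%} convergence)")
```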

  4.

    Surface inaccessibility during polishing – abrasive slurries obscure the surface during processing, preventing the skilled operator from conducting metrology in-process.

  5.

    When to stop processing. If the part is, say, 95% compliant with some aspect of the specification, is the 5% discrepancy within the uncertainty of measurement, or should another corrective cycle be performed (risking regression of form, or new artefacts), or should a concession be sought from the customer or end-user? Today, this judgement rests with either a skilled operator or management.
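Part of this judgement could, in a future Cell, be expressed as an explicit rule that weighs the residual discrepancy against the tolerance and the metrology uncertainty. The sketch below is only illustrative; the thresholds, the single-number error metric and the cycle limit are our assumptions:

```python
# Minimal sketch of a stop / iterate / concession decision. Thresholds are assumptions.

def next_action(measured_error: float, tolerance: float, uncertainty: float,
                cycles_done: int, max_cycles: int) -> str:
    """Return 'accept', 'iterate' or 'seek concession' for one aspect of the specification."""
    if measured_error <= tolerance:
        return "accept"
    if measured_error - uncertainty <= tolerance:
        # Apparent non-compliance lies within the metrology uncertainty.
        return "accept"
    if cycles_done < max_cycles:
        return "iterate"            # another corrective cycle, risking regression or new artefacts
    return "seek concession"        # refer the discrepancy to the customer or end-user

print(next_action(measured_error=21.0, tolerance=20.0, uncertainty=2.0,
                  cycles_done=3, max_cycles=5))   # -> accept
```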

Methodology – practical experience capturing expertise of skilled operators

A core issue is then the role of skilled operators: i) defining initial process parameters, ii) making decisions at each iterative step, and iii) assessing “final” results against the specification and deciding on closure or otherwise. With this in view, we have explored ways to capture the know-how of expert operators, and identified the (not insignificant) difficulties. We have then examined the practical steps that could be taken to improve the determinism of processes. The third and complementary aspect is the engineering and software required to automate process-flow in practice.

Compared with other CNC manufacturing processes in common use, producing ultra-precision surfaces is a very different case in regard to its dependence on the expertise of skilled operators. Given a new material or geometry, this may involve selecting tools, pads, fixturing, polishing slurries and additives, machine configuration, tool-paths, feed-rates, tool-rotation speeds, and more besides. Then, the skilled operator will assess intermediate metrology data, and decide whether simply to conduct another iterative correction based on the error map and (for example) dwell-time modulation. Alternatively, if some unexpected anomaly is discovered, the operator may decide to change some process parameters accordingly.

With this in mind, we have considered whether it would be feasible to capture this type of know-how in a way that could be used in an automated Cell, as part of the model-based approach introduced in “The method of case based reasoning applied to autonomous manufacturing” section. The proof-of-concept experiment we conducted was to record CNC and metrology data, together with video footage and audio. Video was acquired by a web-camera attached to the machine operator’s safety helmet. Audio was recorded using a lapel microphone, and the operator was encouraged to follow the ‘think aloud’ protocol, i.e. voicing thoughts aloud whilst working, capturing both the logic of process-decisions and the resulting practical actions.

Individuals who participated in the experiment gave their informed consent, and the partner company gave consent at Managing Director level.

We selected four highly-skilled hand-polishers and operators of CNC polishing machines and metrology instrumentation. Two of these worked for a research organisation and two for an associated optical manufacturing company. Two main difficulties were encountered in regard to the company workers:-

  1.

    Initially, both company-workers proved unwilling to be involved in the project at any level, even with the support of their management and assurances of anonymity. One of these subsequently agreed to limited participation; the other did not.

  2.

    Before any videoing, it was agreed that the recordings would be viewed by only the research team, and would be stored securely. Issues of commercial security led the participating company to require that human faces, tooling, customer components or metrology equipment in the general area must be masked in the videos. This was challenging and consumed significant resource.

Meanwhile, the two research operators were fully cooperative throughout. In total, some 5 h of useful video and audio was acquired. Comprehensive flow-charts showing all process-steps and decision-points were then derived from the recorded information. These flow charts are too complex for presentation in this paper, having over 40 logical functions, with decision-making and branching. An enlargement of a small part of one such flow-chart is shown for illustration in Fig. 1.

Fig. 1 Detail extracted from a comprehensive process flow-chart derived from recorded video and ‘think aloud’ audio

One important aspect of this is that the flow chart has captured the logical sequence of actions and decisions performed by a skilled operator, when the operator may well not have been consciously aware of the full sequence. In this regard, the technique is potentially powerful in capturing expert knowledge. Nevertheless, the experience also confirmed our preconception that skilled operators in a commercial setting would be secretive and possessive of their knowledge. In a research setting, this appears not to be the case to anything like the same extent, and this may provide the most useful avenue for pursuing this type of approach in the future.

The potential role of digital monitoring of process variables

Given the above experience, the best approach appears to be to focus operator-expertise capture on aspects that cannot be acquired any other way. An anecdotal example we have experienced is adding a particular brand of vinegar to acidify aluminium-oxide slurries for polishing electroless-nickel coatings. In addition, it is necessary to record automatically, and to associate together in a data-base, the widest practical set of process-parameters that can be digitally monitored and time-stamped (a minimal sketch of such a record follows the list). Examples include the following:-

  1. Full design specification of the part – geometry and functional surfaces with tolerances

  2. Physical, chemical and thermal data for the substrate

  3. Composition of abrasive media, coolants etc., including any special additives

  4. Time-stamped in-process monitoring of abrasive-slurry specific gravity, temperature, pH and, preferably, particle size distribution. Slurry expelled from the part-tool interface should be sampled, as this is a direct representation of the process, rather than slurry in a remote tank, which may have settled

  5. Time-stamped in-process monitoring of dynamic variables such as normal and lateral components of process-forces and process-torques, synchronised with tool positions, speeds and accelerations derived from the CNC controller

  6. QR (Quick Response) codes on all tooling, pads and fixtures, with QR readers

  7. Metrology data
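One way to associate such streams (a sketch only; the field names and structure are our assumptions, not an agreed schema) is a time-stamped record per process-step, keyed to the QR-coded tooling and linked to the metrology files:

```python
# Sketch of a time-stamped process record associating the monitored variables listed above.
# Field names and structure are illustrative assumptions, not an agreed schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class SlurrySample:
    timestamp: datetime
    specific_gravity: float
    temperature_c: float
    ph: float

@dataclass
class ForceSample:
    timestamp: datetime
    fx_n: float                  # lateral force components, N
    fy_n: float
    fz_n: float                  # normal force, N
    torque_nm: float
    spindle_rpm: float           # synchronised value from the CNC controller

@dataclass
class ProcessStepRecord:
    part_id: str
    step_name: str                           # e.g. "corrective polish 1"
    substrate_material: str
    tool_qr: str                             # QR code of the tool / pad / fixture in use
    cnc_program: str
    slurry_samples: List[SlurrySample] = field(default_factory=list)
    force_samples: List[ForceSample] = field(default_factory=list)
    metrology_file: str = ""                 # error map measured after the step

record = ProcessStepRecord(part_id="BLANK-0042", step_name="corrective polish 1",
                           substrate_material="fused silica", tool_qr="TOOL-R20-PU",
                           cnc_program="raster_20mm.nc")
record.slurry_samples.append(SlurrySample(datetime.now(timezone.utc), 1.05, 21.3, 9.8))
```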

As mentioned above, it is unfortunate that a surface undergoing processing cannot be measured directly in real-time, as it is obscured by abrasive slurry or coolant. In effect, the process is a ‘closed box’ – process-parameters are input to the box, and sometime later, the part delivered by the box is measured. We identify two main avenues for applying AI techniques. The first is to collect large sets of corresponding part, process and metrology data, and to seek relationships between them using AI operating in batch-mode. The second is the possibility of ‘opening the box’, by gathering indirect information that allows some prediction of the real-time evolution of the surface. The proposed methodology involves monitoring specific real-time process-variables that point to the instantaneous volumetric removal rate, then integrating these rates to give the instantaneous surface form. In principle, the predicted form can then be compared with expectation, and process variables modified accordingly ‘on-the-fly’. Once again, AI searches for input/output relationships, but is now required to operate in real-time, and execution-speed may be the limiting factor. We have discussed this with the high performance computing (HPC) community, to investigate whether resource could be available on a ‘pay as you go’ model, and this indeed looks possible.
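As a sketch of this ‘opening the box’ idea (everything below is an assumption for illustration: the footprint model, the coefficient relating monitored force and speed to removal rate, and the tool-path), an inferred instantaneous volumetric removal rate is spread over the current tool footprint and accumulated, time-step by time-step, into a predicted removal map that can be compared with the commanded removal:

```python
# Sketch of integrating inferred instantaneous removal rates into a predicted removal map.
# The removal-rate estimator, footprint model and tool-path are illustrative assumptions.
import numpy as np

pixel_mm = 0.5
predicted = np.zeros((200, 200))                        # predicted removal map, nm

def footprint_mask(cx_mm: float, cy_mm: float, radius_mm: float = 5.0) -> np.ndarray:
    """Boolean mask of pixels under an (assumed) circular tool footprint."""
    yy, xx = np.mgrid[0:predicted.shape[0], 0:predicted.shape[1]]
    return (xx * pixel_mm - cx_mm) ** 2 + (yy * pixel_mm - cy_mm) ** 2 <= radius_mm ** 2

def accumulate(cx_mm: float, cy_mm: float, volume_rate: float, dt_s: float) -> None:
    """Spread one time-step of removal (volume_rate in nm*mm^2/s) uniformly over the footprint."""
    mask = footprint_mask(cx_mm, cy_mm)
    area_mm2 = mask.sum() * pixel_mm ** 2
    predicted[mask] += volume_rate * dt_s / area_mm2    # depth increment, nm

k_fv = 50.0                                             # assumed coefficient, nm*mm^2 per (N*m)
for step in range(100):                                 # simulated tool-path samples, 1 s apart
    cx, cy = 10.0 + 0.8 * step, 50.0                    # tool centre from the CNC controller, mm
    force_n, speed_ms = 10.0, 0.5                       # monitored force and relative speed
    accumulate(cx, cy, volume_rate=k_fv * force_n * speed_ms, dt_s=1.0)

print(f"Predicted peak removal so far: {predicted.max():.1f} nm")
```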

Process forces and torques are particularly critical process-variables, and may provide useful information to compensate for tool-wear and other effects, including variability of slurry parameters (concentration, particle size distribution and pH). We have completed the detailed design and dynamic finite element analysis of a specialised fixture for the part which, at the time of writing, is out for quotation. Drawing on the importance of frictional coupling in polishing [11], the new fixture is designed to measure process-forces in three orthogonal directions, together with torques. Its first resonant frequency is adequate to resolve ‘tramping’ effects from asymmetries in a rotating tool. We plan to deploy this in CNC polishing to gather real-time data, as above.
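To indicate how such force data might be used (the analysis below is an assumption for illustration, not the fixture’s actual signal-processing), a ‘tramping’ component from an asymmetric rotating tool would appear as a peak in the force spectrum at the once-per-revolution frequency:

```python
# Sketch: detecting a once-per-revolution 'tramping' signature in a lateral force signal.
# The signal is synthesised here; sampling rate and amplitudes are assumptions.
import numpy as np

fs = 2000.0                                 # force-sensor sampling rate, Hz (assumed)
tool_rpm = 600.0                            # tool rotation speed from the CNC controller
f_rot = tool_rpm / 60.0                     # once-per-revolution frequency, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)

# Synthetic lateral force: steady friction + tramping at f_rot + broadband noise.
force = 4.0 + 0.6 * np.sin(2 * np.pi * f_rot * t) + 0.1 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(force - force.mean())) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
band = (freqs > f_rot - 1.0) & (freqs < f_rot + 1.0)
tramping_amp = 2.0 * spectrum[band].max()   # approximate amplitude of the f_rot component, N

print(f"Tramping component at {f_rot:.0f} Hz: ~{tramping_amp:.2f} N")
```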

Application of autonomous intelligent systems/agents to autonomous manufacturing

We postulate that to make progress we should consider the Manufacturing Cell as an Autonomous Intelligent System (AIS). Currently, there is unprecedented global interest in the potential of Artificial Intelligence and in the applications of systems autonomy based on intelligent software [12, 13]. Related applications include the autonomous planning-based control of spacecraft [14], the oil-well drilling process [15], and urban traffic-signal strategies [16]. These applications share similar challenges with autonomous manufacture of optics, in particular the wide range of starting conditions, the complexity of the process, and the inherent uncertainty and imperfect determinism.

AIS are systems that function in dynamic and unpredictable environments, making decisions and carrying out actions in response to demands from users, backed up by sensed data from the environment. While there is a range of architectural choices for AIS, engineering the software that provides their overall management (their ‘brain’) uses a combination of two methods, characterised as either “data-driven” or “model-based”. The former denotes techniques such as neural networks, genetic programming and evolutionary computing, which require very large data sets and training sessions to learn the required function. A specific example is the family of techniques called ‘Deep Learning’, which have proved very successful in implementing the object-recognition part of autonomous vehicles, as well as being adept at natural language processing [17]. Data-driven approaches are particularly suited to system evolution and adaptation, as they can refine automatically with upgraded training data, and generalise their application with new training sets.

Model-driven systems, in contrast, are those that have been explicitly engineered in a similar fashion to traditional software artefacts, and are employed in the Space [14] and Traffic Management [16] applications introduced above. Knowledge of the AIS’s environment, the AIS’s goals, and the AIS’s abilities to affect the world (its actions) are stated as data structures within the application. Model-driven systems may be more difficult to create and evolve than data-driven equivalents, but can more easily be adapted to give explanations of their behaviour, and can be embedded with established explicit knowledge and procedural skills. Methods for engineering model-based systems are more mature than those for data-driven systems, and important processes such as validation and verification are more straightforward.

Different AIS, and indeed different parts of an AIS, may be implemented with one method or the other. For example, a space satellite may need to plan operations in space, and this planning may depend on the laws of motion. It would make no sense to engineer such a system by having it learn those laws from data - they are already well known, hence they should be engineered (stored as explicit knowledge) within the application. On the other hand, if the satellite needs to identify planets, then a sensible method to embody this function might be to use large amounts of previously classified photographs of planets, and employ a data-driven method such as neural networks to implement a recognition function for identifying planets.

The proposed autonomous manufacturing cell will require as input a requirements-specification for the finished part it is to create, and a blank of a specified material, and will produce as output a finished product that meets the specification. From the discussion above, it is apparent that the cell will embody both model-based and data-driven methods. For a specific task, this will require as input data and knowledge extracted from the specification of the part to be produced.

For any task, the Cell software will require access to knowledge about the mechanical, thermal and chemical properties of the wide range of materials that need to be used within the task - in particular scientific knowledge and manufacturers’ data regarding the materials. This, together with invariant knowledge of the process itself, can be encoded into a model, including the likely effects of particular machining tools.

There will also be historical accounts of many past successful attempts to create a part similar to the one at hand, which can be utilised by a data-driven method to adopt an existing tool configuration or adapt it to a new one. For example, past cases indexed by the parameter values of a particular run of a machine will be available for use by a Case-Based Reasoning approach, as discussed below.

Hence, the Cell will employ a range of methods, at both the model-based and data-driven levels. The principal challenge is to generate the process details of the work-flow utilising the components of the proposed Cell. To do this from first principles would require an intelligent agent able to reason with time, resources, actions, processes, events etc. in a dynamic and unpredictable world. In practice, for the synthesis of a process step with complex parameters, preconditions and effects, it is envisaged that a Case-Based approach would be appropriate, using the memorised job-histories and the developed knowledge base. For the setup of an initial multi-step process, prediction regarding accumulated effects, and analysis of metrology data between in-process steps, it is envisaged that we will use a causal model relying on process simulation. Additionally, the Cell should possess enough knowledge and reasoning power to be able to generate an explanation of its behaviour, and to construct a process, or parts of a process, where a previous one did not exist.

In operation, the Cell will employ the generated process description as a template for producing and executing a dynamic schedule, given the actual availability and condition of the tools available to the cell. During execution, it will support decision-making in response to in-process real-time monitoring and to measurement data collected between process-steps (e.g. checking whether the condition of a surface is as expected using the 3D map information). The operational level will also be required to perform recovery from unexpected process events, and may have to re-invoke process-plan generation if the job is not advancing as expected.

The method of case based reasoning applied to autonomous manufacturing

In the current stage of our research, we focus on Case-Based Reasoning (CBR) to automate the polishing process. CBR enables the re-use of concrete, relevant experience from the past when dealing with a new task or problem [18]. Problems that were solved in the past, and their solutions, are memorised as cases in a case-base. To solve a new problem, a CBR system employs a defined similarity measure to retrieve the case (or cases) from the case-base which is (are) most similar to the new problem. In order to address the differences that may exist between the new problem and the retrieved one, an adaptation of the retrieved solution to the new problem needs to take place.

In our study of automating the polishing process, we start with corrective polishing. A key question to address is: in which of its steps do the specialists particularly apply their skills and knowledge? Common steps in grinding and corrective polishing using CNC machines are identified in Fig. 2, where the upper row of boxes represents automated operations, and the grey boxes in the lower row show examples of manual interventions. Cleaning the part between each step is required, but for clarity is not shown.

Fig. 2 Examples of common steps in CNC corrective polishing

A CBR system is assigned to each of the process steps. Each case contains specific knowledge or expertise applied in a specific context in order to decide on the next step. As an example, let us consider a case for optimisation of the processing parameters. Attributes of the case can be split into three groups: (a) attributes concerning characteristics of the part: diameter, thickness, radius-of-curvature, etc.; (b) attributes concerning the material of the part, which imply its chemical, thermal and mechanical properties; and (c) a description of the surface error-map. The case-base communicates with data-bases which contain relevant data about materials. The solution part of each case presents the values of the parameters that the specialists set in the particular step, such as polishing mode, process angle (degrees), head speed (rpm), tool offset (mm), tool overhang (mm), tool pressure (bar), rotation (degrees), point spacing (mm), track spacing (mm) and surface feed (mm/min). These parameter values serve as input to the tool-path generator software on the CNC polishing machine. The solution part of the case also contains the new error map which results from the specified CNC run.
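Such a case could be represented along the following lines (a sketch only; the attribute names mirror groups (a)–(c) and the solution parameters listed above, but the data structure itself is our assumption):

```python
# Sketch of a CBR case for the parameter-optimisation step. The structure is illustrative;
# attribute groups (a)-(c) and the solution parameters follow the description above.
from dataclasses import dataclass

@dataclass
class CaseProblem:
    # (a) part geometry
    diameter_mm: float
    thickness_mm: float
    radius_of_curvature_mm: float
    # (b) material (chemical, thermal and mechanical properties looked up in the material database)
    material: str
    # (c) description of the surface error-map
    rms_error_nm: float
    error_map_file: str

@dataclass
class CaseSolution:
    polishing_mode: str
    process_angle_deg: float
    head_speed_rpm: float
    tool_offset_mm: float
    tool_overhang_mm: float
    tool_pressure_bar: float
    rotation_deg: float
    point_spacing_mm: float
    track_spacing_mm: float
    surface_feed_mm_min: float
    result_error_map_file: str      # error map resulting from the specified CNC run

@dataclass
class Case:
    problem: CaseProblem
    solution: CaseSolution
```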

In the design of a CBR system, it is necessary to define a similarity measure which determines which case from the case-base is the most useful for the current polishing process step for a new part. For example, the material is critical because it determines the removal rate and the optimum conditions to achieve texture. As another example, the radius of curvature affects the process angle-of-attack. The standard simple similarity measure is the k-nearest-neighbour similarity, which measures the weighted difference between attribute-values of the new case and cases from the case-base [19]. However, in this CBR system, we need a more sophisticated similarity measure that takes into consideration how specific or general some values of the case attributes are. For that reason, we plan to define an ontology of concepts in ultra-high precision manufacturing, which will enable the system to infer the level of similarity between two concepts, how specific the concepts or values are, and what the level of commonality between two compared concepts is. A pair of specific values should carry more weight in the definition of similarity than two concepts that may have the same value but are rather general.
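As a baseline before any ontology-based refinement, weighted nearest-neighbour retrieval could look like the sketch below (the weights, normalisation ranges, simple material match and toy case-base are all assumptions):

```python
# Sketch of weighted nearest-neighbour retrieval over a case-base [19]. Weights and
# normalisation ranges are assumptions; an ontology-based similarity, as discussed above,
# would replace the simple exact match on material.
from typing import Dict, List, Tuple

WEIGHTS = {"diameter_mm": 0.2, "radius_of_curvature_mm": 0.3, "rms_error_nm": 0.2, "material": 0.3}
RANGES = {"diameter_mm": 500.0, "radius_of_curvature_mm": 5000.0, "rms_error_nm": 500.0}

def similarity(new: Dict, old: Dict) -> float:
    """Weighted similarity in [0, 1]: normalised distance for numeric attributes, exact match otherwise."""
    score = 0.0
    for attr, weight in WEIGHTS.items():
        if attr == "material":
            score += weight * (1.0 if new[attr] == old[attr] else 0.0)
        else:
            dist = abs(new[attr] - old[attr]) / RANGES[attr]
            score += weight * max(0.0, 1.0 - dist)
    return score

def retrieve(new_case: Dict, case_base: List[Dict], k: int = 1) -> List[Tuple[float, Dict]]:
    ranked = sorted(((similarity(new_case, c), c) for c in case_base),
                    key=lambda pair: pair[0], reverse=True)
    return ranked[:k]

case_base = [
    {"diameter_mm": 100, "radius_of_curvature_mm": 1000, "rms_error_nm": 150, "material": "fused silica"},
    {"diameter_mm": 150, "radius_of_curvature_mm": 800, "rms_error_nm": 60, "material": "Zerodur"},
]
new = {"diameter_mm": 110, "radius_of_curvature_mm": 950, "rms_error_nm": 120, "material": "fused silica"}
print(retrieve(new, case_base, k=1))
```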

Methodology for the mechanics of automation

In terms of practical implementation of an autonomous cell, we have successfully combined an industrial robot with a Zeeko CNC polishing machine and an interferometer measurement station, as shown in Fig. 3. Preliminary work on this topic was previously reported [20]. Since then, the software and the data-acquisition and control infrastructure have been extensively developed. The following complete cycle – the core requirement for an autonomous cell – has now been demonstrated without any manual intervention (a minimal orchestration sketch follows the list):-

  1. Robot places part in auto-change chuck on polishing machine

  2. Polishing run is conducted

  3. Robot lifts part from chuck, tips to expel surface-water, and replaces in chuck

  4. Machine drainage auto-switched from slurry system to holding tank

  5. Robot picks up a wash-down head and washes part

  6. Machine drainage auto-switched back to slurry system

  7. Robot picks up air-nozzle and raster-scans to dry the part

  8. Robot transfers part to auto-change chuck on metrology station

  9. Metrology station auto-aligned using interferometer fringes, and fringe data acquired

  10. Robot transfers part to chuck on polishing machine ready for next cycle
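A sketch of how this cycle might be orchestrated in software is given below; the function names and controller interfaces are our assumptions, not the cell’s actual API:

```python
# Sketch of the demonstrated polish-wash-dry-measure cycle as a simple orchestrator.
# Function names and interfaces are illustrative assumptions, not the cell's actual API.

def run_cycle(robot, polisher, metrology):
    robot.load_part(chuck="polisher")                  # 1. place part in auto-change chuck
    polisher.run_polishing_program()                   # 2. polishing run
    robot.tip_to_drain(chuck="polisher")               # 3. lift, tip to expel surface-water, replace
    polisher.set_drain(target="holding_tank")          # 4. divert drainage from the slurry system
    robot.wash_part()                                  # 5. pick up wash-down head and wash part
    polisher.set_drain(target="slurry_system")         # 6. restore drainage to the slurry system
    robot.dry_part()                                   # 7. air-nozzle raster scan to dry the part
    robot.transfer_part(to="metrology")                # 8. move part to the metrology chuck
    fringe_data = metrology.auto_align_and_measure()   # 9. align on fringes, acquire data
    robot.transfer_part(to="polisher")                 # 10. return part, ready for the next cycle
    return fringe_data
```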

Fig. 3 Robot cell with CNC polishing machine and interferometer station

Discussion

We commenced this paper by pointing out that machining parts to micron tolerances on fully-automated production lines, without manual intervention, is standard practice today. We then asked why this is not possible for ultraprecision surfaces, such as optics. This led us to cite the diversity of materials in common use and the complexity of the underlying physics and chemistry of material-removal, which result in the imperfectly deterministic nature of available processes. This in turn led us to consider the human expertise required to define an appropriate ab initio process-chain for a part with a new geometry or material, the need to review interim results in the iterative refinement of surfaces, and the decision as to when ultimately to stop and accept the part.

This leads us to realise the fundamental importance of digitally acquiring process and metrology data as comprehensive as is practicable, including real-time variables such as process-forces and slurry-conditions, plus computer-readable identification of interchangeable items such as fixturing, pads, tools etc. Given an accumulating data-base, data mining can then be used to reveal underlying patterns that may be beyond our current knowledge, for example to detect complex cause-and-effect relationships between process variables and outputs. These can also inform the design of the components of CBR. For example, it may lead to revised weights in the similarity measure between cases, and may assist in the adaptation phase of the CBR, as required to address the differences between a new case and the case retrieved from the case-base.

In reviewing how to proceed from the current status, Fig. 4 is our synthesis of the primary work-packages required to implement an autonomous manufacturing cell.

Fig. 4 Principal technical work-packages to deliver an autonomous cell

The physical implementation of automation using robots is now well-established, as described above. This benefits from the wide range of end-effectors and other automation components commercially available. The psychological aspects of capturing the know-how of skilled operators have also been demonstrated, giving insightful guidance as to how complex sequences of sub-operations are structured in practice. However, this work also presented ‘human factors’ challenges, and the approach is probably more practical in a research environment than in a commercial one.

We continue to work on several fronts to improve processes, for example deploying non-Newtonian tools to improve conformance and minimise potential for transitioning between removal-regimes [10, 11]. This follows the work of Kim and Burge [21], but has additionally explored the relationship between tool-rotation and materials properties, in order to enhance removal rates. We are also developing a novel method of measuring freeform surfaces, which will be reported in due course. We routinely manufacture fixtures and tools using additive manufacturing, but have yet to automate their design. As regards artificial intelligence, this will require considerable effort, and so we are carefully studying different strategies in preparation for the implementation phase. Finally, work has not started on Cell interfaces and cyber-security, although these are regarded as critical items.

Conclusion

From the work reported above, our view is that the way forward is to combine expertise captured predominantly in a research environment with i) comprehensive real-time process-data, ii) identification of interchangeable tooling etc., iii) metrology data acquired between process-steps and on completion, and iv) a data-base of physical, chemical and thermal properties of substrate materials. Together, these will provide the basic feedstock information to bring to bear the artificial intelligence methods, such as those described in this paper, to design ab initio process chains for new parts, optimise processes ‘on-the-fly’, review interim metrology results, and make process-decisions to close the process loop fully automatically. The impact of successfully achieving autonomous manufacture of ultraprecision surfaces is likely to be very significant, given the growing market and increasing shortage of highly-skilled practitioners in the field. Full automation without human intervention is extremely challenging, and well beyond today’s state-of-the-art. Nevertheless, we have started on this road.

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available, because the capturing of expert operator know-how was conducted on the condition that the data are proprietary and will be kept confidential. However, the conclusions presented in this paper are not dependent on any aspect of the data itself, but only on the experience gained in acquiring it, which is described above.

Abbreviations

AIS:

Autonomous intelligent systems

CBR:

Case based reasoning

CNC:

Computer numerical control

HPC:

High performance computing

Parms:

Parameters

QR Code:

Quick response code

References

  1. Market Research Study Photonics 2017, Key Data, 3rd Edition, https://www.photonics21.org/download/ppp-services/photonics-downloads/Photonics21_3.-edition_Key-Data_Market-Research-Report-2018.pdf

  2. Diagnostic Imaging Market, Global forecast to 2020, marketsandmarkets (2016) https://www.marketsandmarkets.com/Market-Reports/diagnostic-imaging-market-411.html

  3. Tonnellier, X., Morantz, P., Shore, P., Compley, P.: Precision grinding for rapid fabrication of segments for extremely large telescopes using the Cranfield BoX. Proc. SPIE. 7739, (2010). https://doi.org/10.1117/12.858806

  4. Walker, D., Freeman, R., Morton, R., McCavana, G., Beaucamp, A.: Use of the ‘Precessions’ process for pre-polishing and correcting 2D & 2½D form. Opt. Express 14(24), 11787–11795 (2006)

  5. Yu, Y., Walker, D., Li, H.: Implementing Grolishing process in Zeeko IRP machines. Appl. Opt. 51(27), 6637–6640 (2012)

  6. Walker, D., Yu, G., Li, H., Messelink, W., Evans, R., Beaucamp, A.: Edges in CNC polishing: from mirror-segments towards semiconductors, paper 1: edges on processing the global surface. Opt. Express. 20(18), 19787–19798 (2012)

  7. Li, H., Walker, D., Yu, G., Sayle, A., Messelink, W., Evans, R., Beaucamp, A.: Edge control in CNC polishing, paper 2: simulation and validation of tool influence functions on edges. Opt. Express. 21(1), 370–381 (2013)

  8. Preston, F.W.: J. Soc. Glass Technol. 11, 247 (1927)

  9. Maury, A., Ouma, D., Boning, D., Chung, J.: Proc. Conf. Advanced Metallisation and Interconnect Systems for ULSI Applications, San Diego, California, Pub. Academic Press (1997)

  10. Hocheng, H., Tsai, H.Y., Su, Y.J.: Modeling and experimental analysis of the material removal rate in the chemical mechanical planarization of dielectric films and bare silicon wafers. J. Electrochem. Soc. 148(10), G581–G586 (2001)

  11. Pal, R.K., Garg, H., Sarepaka, R.G.V., Baghel, P.: Friction at Workpiece-Polisher Interface during Optical Polishing Process. Int. J. Mech. Eng. 1(1), 32–35 (2014)

  12. Hall, W, Pesenti, J: Growing the artificial intelligence industry in the UK. https://www.gov.uk/government/publications/growing-the-artificial-intelligence-industry-in-the-uk. Accessed 30 Jan 2019

  13. The National AI Research and Development Strategic Plan, USA. https://www.nitrd.gov/PUBS/national_ai_rd_strategic_plan.pdf. Accessed 30 Jan 2019

  14. Bresina, J.L.: Activity planning for a lunar orbital mission. AI Mag. 37(2), 7–18 (2016)

  15. Fox, M., Long, D., Guillaume, R., Sangulov, R.: Patent Method of Creating and Executing a Plan, WO2016100973A1, 23 (2016)

  16. McCluskey, T.L., Vallati, M.: Embedding Automated Planning within Urban Traffic Management Operations. In: Proceedings of the International Conference on Automated Planning and Scheduling ICAPS (2017)

  17. Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural language processing (almost) from scratch. J. Mach. Learn. Res. 12, 2493–2537 (2011)

  18. Kolodner, J.: Case-Based Reasoning. Morgan Kaufmann, San Francisco (1993)

  19. Richter, M., Weber, R.: Case-based reasoning: a textbook. Springer-Verlag, Berlin (2013)

  20. Walker, D., Yu, G., Bibby, M., Dunn, C., Li, H., Wu, Z., Xiao, P., Zhang, P.: Robotic automation in computer controlled polishing. J. Eur. Opt. Soc.-Rapid. 11, 16005 (2016)

  21. Kim, D.W., Burge, J.H.: Rigid conformal polishing tool using non-linear visco-elastic effect. Opt. Express. 18(3), 2242–2257 (2010)

Acknowledgements

The authors wish to acknowledge research grants from UK-EPSRC (including the Network Plus initiative), UK-STFC, Innovate-UK and the UK Centre for Earth Observation Instrumentation. They also wish to acknowledge the support of the senior management at the anonymous company that participated in the experiments to capture know-how from expert machine operators, and the help of the one operator who did agree to participate. Zeeko Ltd. also played a significant role, particularly in its leadership of the “RoboZeek” Innovate-UK project, which developed the robotic automation of CNC polishing machines. Hongyu Li and his work on freeform metrology for future autonomous cells is financially supported by the EPSRC-funded University of Huddersfield Future Metrology Hub. Chenhui An’s substantial effort on mechanical design and finite element analysis for in-process force-measuring is also gratefully acknowledged.

Funding

The work reported under capturing the expertise of skilled operators was supported by a grant from the UK Engineering and Physical Sciences Research Council (EPSRC). The development of the CNC processing technologies on which the work depended was supported by EPSRC and the UK Science and Technology Facilities Council (STFC). Work on combining robots with CNC polishing machines to automate manual interventions was supported by Innovate-UK and Zeeko Ltd. Metrology developments underpinning the automation work are supported by funding from the Future Metrology Hub at the University of Huddersfield, which is funded by EPSRC. H. Li also gratefully acknowledges financial support from this source.

Author information

Contributions

DDW originated the concept of the autonomous cell, leads the team working on it, and is the lead author and coordinator of the paper. TLM and SP have contributed to the manuscript and the work reported, in two complementary aspects of using artificial intelligence in this context. GY and HYL were the willing research-organisation expert-operator participants, and have contributed materially to the overall development of the Cell concept reported in this paper. All authors read and approved the final manuscript.

Corresponding author

Correspondence to David Douglas Walker.

Ethics declarations

Competing interests

DDW is a founding director and Research Director of Zeeko Ltd., and has a minority shareholding.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Walker, D.D., McCluskey, T.L., Yu, G. et al. Fully automating fine-optics manufacture - why so tough, and what are we doing?. J. Eur. Opt. Soc.-Rapid Publ. 15, 24 (2019). https://doi.org/10.1186/s41476-019-0119-y

