Effect of the global pandemic on science

The global pandemic has profoundly disrupted experimental methods and academic learning across the sciences. For the first time, hands-on experimental work was brought to a standstill by the inability to access laboratories. Furthermore, 1.3 billion school and university students in 186 countries cannot attend classrooms and are currently pursuing education online [1]. While this shift allows students to learn from any point on the globe, the main complication arises when they cannot enter laboratories and gain experience in performing experiments. The growing demand for remote experimentation using robotic instrumentation could establish a working model for students and scientists despite the global pandemic. Nanoparticle (NP) synthesis is typically carried out using manual manipulations and methods that demand considerable staff, resources and time. A fully functional robotic system would overcome these hurdles by creating a ‘closed loop’ of NP synthesis, immediate analysis and optimisation (Fig. 1). By integrating AI into software-operated laboratory machinery and establishing remote access to the controlling computer, anyone from any part of the world would be able to participate in experimental procedures and gain valuable research skills. With this, however, comes a separate responsibility: maintaining the integrity of the computational system, protecting users’ intellectual property (IP) and reducing the threat of unauthorised network access. Mitigating network security breaches requires multiple protection measures, such as strengthened user access control, up-to-date and standardised software, two-factor authentication, firewall access restrictions, network-level authentication and employee training in network security protocols [2].

Fig. 1

Schematic diagram of an AI-automated workstation. A functional AI-guided robotic nanoparticle synthesis and analysis system will start with genetic analysis of individual patients and personalised treatment selection. The nanoparticles will then be formulated and tested in cells and organ-on-chip models using automated systems. The AI system will ultimately be able to determine the best formulation and treatment for each patient for maximum response and optimal outcome

Nanotechnology in drug and siRNA delivery

The use of NPs dates back to the 1970s, when nanoscale liposomes were loaded with medication and delivered to diseased cells [3,4,5,6,7]. The potential of nanotechnology is becoming increasingly realised across numerous fields, especially personalised medicine [8,9,10]. NPs make it possible to overcome biological barriers and deliver drugs and other compounds effectively, while promising to preferentially target drugs to specific biomarkers in individual patients [11,12,13,14,15]. However, the formulation of NPs comes with numerous challenges [16]. The properties of NPs are determined by several key characteristics, including size, surface morphology and charge, chemistry and drug release profile [17,18,19]. Subtle changes in the formulation process and composition can alter the properties of the nanomedicine, leading to unwanted consequences and hurdles in reproducing experiments [18,19,20,21]. Hence, better control over the stages involved in manufacturing NPs is crucial. Aside from delivering small-molecule drugs, NPs are increasingly being used to deliver biologics, including small interfering RNA (siRNA) and proteins [22,23,24,25,26,27,28,29]. NPs help to overcome the biological instability and poor cellular penetration of genetic payloads [30,31,32]. Applications of siRNA-nanoparticle complexes span numerous diseases, including cancer and viral infections [33,34,35,36,37]. Various siRNA and drug combinations have been complexed with different compositions of lipid NPs (LNPs) and other NPs [38,39,40]. Therefore, careful selection and formulation of particles is key at the complexation step. Establishing a system that screens for optimal nanoparticle preparation will pave the way for more efficient siRNA and drug delivery and its implementation as a personalised medicine tool.

Microfluidics and automated synthesis of nanoparticles

Microfluidics, the study and manipulation of fluids at the micro- and nanoscale, allows improved control of particle characteristics and reduces variation between synthesised batches [41]. Owing to these benefits, it is becoming increasingly used within laboratories. Microfluidic chips are a type of microfluidic device that allows tiny volumes of particle-containing liquid to be processed and visualised [42]. The chips combine micro-pumps, micro-valves, micro-mixers, micro-separators and micro-sized channels (diameters ranging from 1 to 1000 µm) [42, 43]. The pumps move liquid within the channels of the chip at precisely controlled, low flow rates, allowing fine control over physical and chemical reactions. The liquids may contain either cells or nanoparticles. These microfluidic devices have opened an extensive range of research possibilities that were previously unachievable. For example, microfluidic chips have been used to study the migration of lung cancer cells under different cancer invasion microenvironments [44].
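
This fine control stems largely from laminar flow: at typical microchannel dimensions and flow rates, the Reynolds number is far below the turbulent transition, so mixing is diffusion-dominated and predictable. The short Python sketch below illustrates this with representative values for water in a 100 µm channel; the specific numbers are illustrative assumptions rather than values taken from the cited studies.

```python
# Reynolds number for flow in a microfluidic channel: Re = rho * v * d / mu
# Illustrative values only (water at room temperature, 100 um channel).

rho = 1000.0   # fluid density, kg/m^3 (water)
mu = 1.0e-3    # dynamic viscosity, Pa*s (water)
d = 100e-6     # channel diameter, m (100 um)
v = 0.01       # mean flow velocity, m/s (10 mm/s)

reynolds = rho * v * d / mu
print(f"Reynolds number: {reynolds:.2f}")  # ~1, far below the ~2000 turbulent transition

# A low Reynolds number means laminar, diffusion-dominated flow, which is what
# gives microfluidic chips their fine control over mixing and particle formation.
```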

Notably, research and industrial operations have established efficient ways of utilising microfluidics for large-scale NP production [45,46,47,48,49,50,51]. Currently, microfluidics is most commonly used for the synthesis of LNPs for RNA/DNA delivery [52, 53]. However, the microfluidic production of LNPs is still being refined [54, 55]. For example, a microfluidic approach to producing droplet-stabilised giant unilamellar vesicles (dsGUVs) has shown promising results compared with conventional methods of synthetic cell synthesis [56]. In addition, shaping NP configuration using microfluidic techniques is gaining traction and providing a deeper understanding of NP synthesis and of the physicochemical assembly properties of the molecular building blocks [39, 57]. For example, PRINT (Particle Replication In Non-wetting Templates), a nanofabrication technique, enables particles to be generated with full control over size, shape and surface chemistry, properties that affect their cellular internalisation and intracellular trafficking [58]. Achieving particle synthesis automation by combining a robotic liquid handler with microfluidic devices holds great promise for the future of laboratories and industry.

Liquid handling and robotics

Automation has become a focal point for biomedical research. An essential part of any nanotechnology laboratory is liquid handling. Historically, liquid handling has been a manual task in which the user dispensed liquid reagents by manual pipetting. Manual tasks in practical experimentation are often associated with human error and with limitations of time and space [59]. Monotonous tasks in particular are prone to human error, which can lead to a wide range of mistakes [60]. In many scenarios, when unexpected results are obtained, the first question is whether there were sample and fluid handling errors [61]. A study by Schwarze et al. [62] showed that 15% of the cost of genome sequencing is due to laboratory personnel errors. In an effort to reduce human error, robotic liquid handlers (LiHas) are becoming increasingly utilised. Robotic LiHas can perform simple human functions, such as aspirating and dispensing liquids into different tubes and wells, whilst maintaining high precision and accuracy [63]. Moreover, the range of robotic LiHa functions can be expanded into a multifunctional workstation for chemical and nanoparticle synthesis. This expansion can be achieved by integrating robotic arms and various laboratory devices, such as centrifuges, microplate washers and readers, heat sealers, heaters and shakers, bar code readers, storage devices and incubators [63]. Thus, robotic LiHas can be optimised for numerous techniques, including ELISA, PCR, genomic research, thin layer chromatography (TLC) spotting, solid-phase extraction (SPE), liquid-liquid extraction, nucleic acid preparation and cell-nanoparticle tests [64, 65]. Furthermore, dual-arm liquid handlers can emulate human movements, giving them a high number of degrees of freedom and allowing an automated procedure to act as a 1:1 representation of a manual, human-operated experiment [66]. During the current pandemic, robotic LiHas have provided advantages and improved safety in the testing of SARS-CoV-2 (COVID-19) samples [67]. Reducing the exposure of laboratory technicians to COVID-19 samples is an excellent illustration of the advantages of a programmable LiHa. Such systems provide consistent performance, increased throughput and improved accuracy and precision, whilst reducing human error. Automating NP synthesis with a LiHa offers significant advantages over manual preparation methods: robotic systems allow multiple formulations to be prepared in a short period of time, their activity to be characterised on biological specimens and the nanoparticle formulation to be optimised in an automated manner (Fig. 3).
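
As a rough illustration of how such a workstation might be scripted, the sketch below mimics a LiHa preparing a small screen of lipid and siRNA volumes across a plate. The LiquidHandler class, its methods and the labware names are hypothetical stand-ins that merely log the intended actions; a real deployment would use the instrument vendor's own control API.

```python
"""Minimal sketch of an automated nanoparticle formulation screen.

The LiquidHandler class is a hypothetical mock driver that only logs actions;
in practice it would wrap a vendor-specific robot control API.
"""

class LiquidHandler:
    """Hypothetical mock driver for a robotic liquid handler."""

    def aspirate(self, volume_ul: float, source: str) -> None:
        print(f"Aspirate {volume_ul:5.1f} uL from {source}")

    def dispense(self, volume_ul: float, destination: str) -> None:
        print(f"Dispense {volume_ul:5.1f} uL into {destination}")


def prepare_formulation_screen(robot: LiquidHandler,
                               lipid_volumes_ul: list,
                               sirna_volume_ul: float = 10.0) -> None:
    """Prepare one well per lipid volume, keeping the siRNA amount constant."""
    for well_index, lipid_volume in enumerate(lipid_volumes_ul):
        well = f"plate1:A{well_index + 1}"          # hypothetical well naming
        robot.aspirate(lipid_volume, "reservoir:lipid_mix")
        robot.dispense(lipid_volume, well)
        robot.aspirate(sirna_volume_ul, "reservoir:siRNA_stock")
        robot.dispense(sirna_volume_ul, well)


if __name__ == "__main__":
    # Screen a range of lipid volumes against a fixed siRNA volume.
    prepare_formulation_screen(LiquidHandler(), lipid_volumes_ul=[5, 10, 20, 40])
```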

Automated microscopy

A crucial part of establishing an automated nanotechnology laboratory lies in applying nanomedicines to cells to test particle uptake and their effect on cell morphology and viability. In particular, automated confocal microscopes containing incubators allow live-cell imaging and characterisation of fluorescent components [68]. Despite its high accuracy and precision, manual confocal fluorescence microscopy is time-consuming when hundreds of samples must be analysed [69]. The need for automated microscopy has grown since measuring a 384-well plate in a single field of view within only several minutes became a reality [70]. In addition, automated microscopes can produce high-quality 3D images at higher acquisition rates and with much wider fields of view, encompassing the whole microplate instead of focusing only on individual wells [71]. This allows users to observe minuscule structures within the cell and on the cell surface [72, 73]. Incorporating a fluorescence microscope into a robotic workstation can improve the efficiency with which results are acquired and the determination of nanoparticle trafficking within, and effect on, each cell [74]. Features such as a wide-field-of-view camera and laser autofocus improve the quality and speed of image acquisition and enable imaging even in non-glass-bottom plates [73]. The ability to work with live cells can further improve experimentation, permitting multiple treatments and the recording of changes in cells over time. Furthermore, rapid automated analysis of collected data following image acquisition is a potential improvement over current systems. Acquiring large datasets of microscopy images and incorporating machine learning into biotechnological analysis allows existing databases to be maintained and improved, and the resulting knowledge to be applied to many disease models and to personalised approaches to medicine [75].
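
As a minimal sketch of the kind of automated analysis that could follow image acquisition, the example below segments bright fluorescent objects and reports per-object area and intensity using scikit-image. The synthetic image stands in for a real confocal frame, and the segmentation settings are illustrative assumptions rather than recommended parameters.

```python
"""Sketch of automated quantification of fluorescent signal in a microscopy image.

A synthetic image stands in for a real confocal frame; segmentation settings
are illustrative assumptions only.
"""

import numpy as np
from skimage import filters, measure

# Build a synthetic "fluorescence" frame: dim noisy background plus bright spots.
rng = np.random.default_rng(0)
image = rng.normal(loc=0.05, scale=0.02, size=(256, 256))
for row, col in [(60, 60), (128, 200), (200, 90)]:
    image[row - 4:row + 4, col - 4:col + 4] += 1.0
image = filters.gaussian(image, sigma=2)

# Threshold (Otsu), label connected objects and report per-object statistics.
threshold = filters.threshold_otsu(image)
labels = measure.label(image > threshold)
for region in measure.regionprops(labels, intensity_image=image):
    print(f"object {region.label}: area={region.area} px, "
          f"mean intensity={region.mean_intensity:.3f}")
```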

Artificial intelligence in research

Nanomedicine offers new ways of preventing and treating diseases. However, the full potential of nanomedicine is yet to be realised [76]. The use of automation is a step in the right direction for manufacturing nanoscale drugs, but it is only one piece of the puzzle. To improve nanomedicines and ensure that they achieve their desired result, computational analysis of large amounts of data must take place [77]. The next step is therefore to integrate AI and machine learning into the evaluation and formulation of nanoscale drugs [78, 79]. While traditional computational methods require deep physical, chemical and biological knowledge to construct relevant and accurate models [77], AI algorithms require only training datasets, which can be produced by automated synthesis practices or sourced from the literature [80, 81]. Providing large datasets of experimental results relevant to the subject of study allows the algorithm to produce accurate prediction models that can then be translated into improved nano-formulations.

There are multiple areas where machine learning can be integrated into nanomedicine applications [77, 82, 83]. For example, machine learning can be used to improve our understanding of how the structure of a nanoparticle affects its characteristics as well as its interaction with target tissues and cells. In a machine learning study of the adverse effects of nanoparticle properties, Puzyn et al. predicted the cytotoxicity of 17 different metal oxide NPs towards Escherichia coli [82]. Alternatively, machine learning can help determine the correlation between drug dosage and therapeutic outcome. A recent study obtained gene expression profiles from 82 breast cancer patients and trained a machine learning algorithm to predict complete pathologic responses with an accuracy of 92% [84]. It should also be noted that there are algorithmic approaches that can be used when insufficient data are available to train the algorithm; however, it is imperative that the choice of algorithm is appropriate for the experiment. One example of a machine learning algorithm that does not require large datasets is the artificial neural network (ANN) [85]. In a recent study of poly(lactic-co-glycolic acid) (PLGA) NPs by Baghaei et al., an ANN was used to predict optimal particle size and drug release [85]. The algorithm reduced the prediction error from 28.0% to 2.93% for particle size and from 19.4% to 2.99% for the initial burst release of PLGA NPs [85]. The study shows that integrating ANNs into NP drug release experimentation provides an improved alternative to traditional computational approaches. To advance the field further, however, pathologists and AI experts will need to work closely together to produce improved AI systems that deliver accurate outcomes using automated robotic systems for particle synthesis. Integrating AI-driven data analysis and processing will allow faster and cheaper drug discovery, screening and application in both laboratories and industry [86].
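
A minimal sketch of such an ANN workflow is given below, using scikit-learn's MLPRegressor to map formulation parameters to particle size and burst release. The data are synthetic, and the chosen input features, targets and network size are illustrative assumptions; this is not a reconstruction of the Baghaei et al. model.

```python
"""Sketch of an ANN mapping formulation parameters to NP size and burst release.

Synthetic data only; features, targets and network size are illustrative
assumptions and do not reproduce the cited study's model.
"""

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical formulation parameters: polymer conc. (mg/mL),
# drug:polymer ratio, stirring rate (rpm).
X = np.column_stack([
    rng.uniform(10, 100, n_samples),
    rng.uniform(0.05, 0.5, n_samples),
    rng.uniform(200, 1200, n_samples),
])

# Hypothetical targets: particle size (nm) and initial burst release (%),
# generated from an assumed relationship plus noise.
size_nm = 80 + 1.5 * X[:, 0] - 0.02 * X[:, 2] + rng.normal(0, 5, n_samples)
burst_pct = 10 + 60 * X[:, 1] + rng.normal(0, 2, n_samples)
y = np.column_stack([size_nm, burst_pct])

# Train a small multi-output neural network and report held-out performance.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```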

The future of remote science and AI in nanotechnology automation

AI is being integrated into clinical and laboratory research, as seen in Fig. 2. However, various barriers affect the development and adoption of machine learning in laboratories. One example is the lack of sufficiently large datasets to inform machine learning models. Rectifying this will require changing the structure of chemical laboratories and opening them to students and experts from numerous fields. These new team members will work alongside chemists, biologists, biochemical engineers and biotechnology engineers (among others) to achieve optimal performance, relevance and results. Overall, the automated approach to nanoparticle preparation and utilisation holds great promise for creating an effective way of conducting research and improving the field of personalised medicine, as shown in Fig. 3, which summarises the key differences between the two approaches.

Fig. 2

In the field of pathology, AI is already making major changes; a similar impact could occur in nanomedicine synthesis and prediction. Graphical representation of PubMed results for two search queries: [“Machine Learning” AND “Laboratory Medicine”] and [“Machine Learning” AND “Pathology”]

Fig. 3

Graphical comparison of the hands-on and automated approaches to experimentation

Nanotechnology is rapidly evolving and adopting novel technological formats built on machine learning and integrated analytical systems. Artificial intelligence shows significant potential for “closing the loop” of nanoparticle synthesis, characterisation, refinement and the testing or prediction of activity in vitro and in vivo. The journey that begins in experimental laboratories is still lined with multiple challenges and obstacles. However, recent studies and advances in AI integration, automation and nanotechnology inspire confidence that we are heading in the right direction.