Past projects


ALPHA-STEM – Advanced Laboratory Phantoms for Soft Tissues in Engineering and Medicine

People involved: Antonio Forte, Elena De Momi

Funding source: H2020-MSCA-IF-2017
Grant number: 798244

Funding period: 2018 – 2021



Research has shown that the success rate of many types of surgery is closely related to the experience of the surgeon. Early in their career, however, trainees are not given the opportunity to operate on a sufficient number of patients or to perform an exhaustive mix of procedures. The situation has been further worsened by the reduction of assisted training hours in Europe (since 2009) and the US (since 2011). Training and technical tasks are usually practised on cadavers, on animals, or using virtual simulators, but all of these alternatives present difficulties: limited availability and expensive handling and preservation processes (cadaveric training), non-human anatomical structures (animal training), and costly set-up with doubtful skills transfer to the real operating theatre (virtual simulators). A potential solution is to promote the use of artificial synthetic models, also known as phantoms. Phantoms are reproductions of human parts and organs that allow the trainee to practise positioning of the anatomical structures as well as hand coordination. Unfortunately, they lack reliable tactile feedback (e.g. palpation) and realistic tissue deformation patterns, which critically reduces the fidelity of surgical training.

The main objective of this project is to overcome these limitations by developing phantoms that provide detailed anatomical structures along with an accurate tactile response during surgical tasks such as cutting, indentation and suturing. The proposed investigation aims at designing, making and testing advanced synthetic materials tailored to reproduce the mechanical response of different human organs and tissues (lung, brain, liver, skin, cartilage, etc.). Direct comparisons with experimental data on organic tissues, together with feedback from a number of experienced surgeons, will be used to validate the effectiveness of the proposed solutions on this research journey towards safer surgeries.

SMARTsurg – SMart weArable Robotic Teleoperated Surgery

People involved: Elena De Momi, Giancarlo Ferrigno, Nima Enayati, Hirenkumar Nakawala, Jacopo Buzzi

Funding source: RIA H2020-ICT-2016
Grant number: H2020-ICT-2016-732515

Funding period: 2017 – 2020

Project partners: University of the West of England (United Kingdom), Ethniko Kentro Erevnas kai Technologikis Anaptyxis (Greece), North Bristol National Health Service Trust (United Kingdom), University of Bristol (United Kingdom), Istituto Europeo di Tecnologia (Italy), Idiotiko Poliiatrio Orthopaidikis Chirourgikis Athlitikon Kakoseon kai Apokatastasis Etairia Periorismeni Efthinis (Greece), Cybernetix (France), Optinvent (France), Hypertech Innovations Limited (United Kingdom)
Robot-assisted minimally invasive surgery (RAMIS) offers many advantages over traditional MIS, including improved vision, precision and dexterity. While the popularity of RAMIS is steadily increasing, its potential to improve patient outcomes and extend to many more procedures is not fully realised, largely because of serious limitations in the current instrumentation, control and feedback to the surgeon. In particular, restricted access, lack of force feedback, and the use of rigid tools in confined spaces filled with organs stand in the way of full adoption. We aim to develop novel technology to overcome these barriers and expand RAMIS to more procedures, focusing on real-world surgical scenarios in urology, vascular surgery and soft-tissue orthopaedic surgery. A team of highly experienced clinical, academic and industrial partners will collaborate to develop:

  • dexterous anthropomorphic instruments with minimal cognitive demand;
  • a range of bespoke end-effectors with embedded surgical tools, using additive manufacturing methods for rapid prototyping and testing within a user-centred approach;
  • a wearable multi-sensory master for tele-operation that optimises perception and action;
  • wearable smart glasses for augmented-reality guidance of the surgeon, based on real-time 3D reconstruction of the surgical field, using dynamic active constraints to restrict the instruments to safe regions.

The demonstration platform will be based on commercial robotic manipulators enhanced with the SMARTsurg advanced hardware and software features. Testing will be performed on laboratory phantoms with surgeons, to bring the technology closer to exploitation and to validate acceptance by clinicians. The study will benefit patients, surgeons and health providers by promoting safety and ergonomics as well as reducing costs. Furthermore, there is potential to improve complex remote-handling procedures in other domains beyond RAMIS.

EDEN2020 – An Enhanced Delivery Ecosystem for Neurosurgery in 2020

People involved: Elena De Momi, Giancarlo Ferrigno, Alberto Favaro, Marco Vidotto, Sara El Hadji, Alice Segato

Funding source: RIA H2020-ICT-2015
Grant number: ICT-24-2015-688279

Funding period: 2016 – 2020

Due to an aging population and the spiralling cost of brain disease in Europe and beyond, EDEN2020 aims to develop the gold standard for one-stop diagnosis and minimally invasive treatment in neurosurgery. Supported by a clear business case, it will exploit the unique track record of leading research institutions and key industrial players in the field of surgical robotics to overcome the current technological barriers that stand in the way of real clinical impact.


EDEN2020 will provide a step change in the modelling, planning and delivery of diagnostic sensors and therapies to the brain via flexible surgical access, with an initial focus on cancer therapy. It will engineer a family of steerable catheters for chronic disease management that can be robotically deployed and kept in situ for extended periods. The system will feature enhanced autonomy, surgeon cooperation, targeting proficiency and fault tolerance, with a suite of technologies commensurate with the unique challenges of neurosurgery. Among these, the system will be able to sense and perceive intraoperative, continuously deforming brain anatomy at unmatched accuracy, precision and update rates, and to deploy a range of diagnostic optical sensors with the potential to revolutionise today’s approach to brain disease management. By modelling and predicting drug diffusion within the brain with unprecedented fidelity, EDEN2020 will contribute to the wider clinical challenge of extending and enhancing the quality of life of cancer patients, with the ability to plan therapies around delicate tissue structures and with unparalleled delivery accuracy.
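The kind of drug-transport modelling mentioned above can be illustrated with a minimal one-dimensional diffusion step using explicit finite differences; the diffusivity, grid spacing and time step below are illustrative assumptions, not EDEN2020 values or methods.

```python
def diffuse_1d(conc, D=1e-9, dx=1e-3, dt=0.1):
    """One explicit time step of dC/dt = D * d2C/dx2 on a 1D grid
    with zero-flux boundaries. Stable when D*dt/dx**2 <= 0.5.
    All parameter values here are illustrative assumptions."""
    r = D * dt / dx**2
    new = conc[:]
    for i in range(1, len(conc) - 1):
        new[i] = conc[i] + r * (conc[i + 1] - 2 * conc[i] + conc[i - 1])
    new[0], new[-1] = new[1], new[-2]  # zero-flux boundaries
    return new
```

Repeating such a step over a 3D grid built from patient imaging, with spatially varying diffusivity, is one simple way to approximate how an infused drug spreads through tissue.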

EDEN2020 is strengthened by a significant industrial presence, which is embedded within the entire R&D process to enforce best practices and maximise translation and the exploitation of project outputs. As it aspires to impact the state of the art and consolidate the position of European industrial robotics, it will directly support the Europe 2020 Strategy.
For more information, please visit the EDEN2020 website.

Machine Learning-based Adaptive Robot-assisted Training for Surgical Robotics

People involved: Elena De Momi, Nima Enayati

Funding source: Intuitive Surgical Technology research grant

Funding period: 2017

Statement of Work:

The main hypothesis of this proposal is that the training phase a trainee is in can be recognized from performance measures, and that proportional physical guidance can measurably improve the outcome of the training procedure. The project comprises two principal aims:

Aim 1: Implementing trainee progress assessment through machine learning methods

The methodology for assessing surgical skills is gradually shifting from the subjective scoring of an expert, which may be a variably biased opinion based on vague criteria, towards more quantitative analysis. This project aims at assessing various skills of a tele-operated surgical system’s operator through a Machine Learning (ML) powered agent. The assessment is in the context of training, and thus the agent will be learning-phase-aware. For a set of predefined tasks, the ML agent will measure the performance of a trainee and produce an estimate of his/her progress with respect to the average learning curve of the specific task, which can be used as effective feedback to the trainee or to adapt the training program accordingly.
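As a rough illustration of this idea, the sketch below compares a trainee's measured score at a given session against an average learning curve; the exponential curve model, its parameters, and the scalar performance score are hypothetical assumptions, not the project's actual method.

```python
import math

def average_learning_curve(session, plateau=0.9, rate=0.3):
    """Expected performance after `session` training sessions,
    using a simple exponential learning-curve model (assumed)."""
    return plateau * (1.0 - math.exp(-rate * session))

def progress_estimate(measured_score, session):
    """Trainee score relative to the expected score at this session:
    > 1.0 means ahead of the average learning curve, < 1.0 behind it."""
    expected = average_learning_curve(session)
    return measured_score / expected if expected > 0 else float("inf")
```

In the project, such an estimate would come from a learned model over richer performance measures rather than a fixed analytic curve.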

Aim 2: Design and evaluation of adaptive physical assistive methods in training

The second aim of the project is to investigate the effects of providing physical guidance in primary phases of tele-operated surgical training. This robot-assisted training guides the trainee through motion, force, torque, or vibration cues applied by the master device and is adapted to users’ progress modelled by the ML agent. Such an assistive method is hypothesized to facilitate the initial stages of training by preventing cognitive overload. These methods will be studied through multiple experiments using a tele-operation setup with statistically significant populations of subjects. The results of the study will reveal potential benefits or harms of motion guidance in skills training in terms of both immediate improvements and long-term retention.
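A minimal sketch of how such guidance could fade as the trainee improves, assuming a spring-like pull toward a reference trajectory point and a scalar progress estimate in [0, 1]; both the gain law and the force model are illustrative assumptions, not the project's design.

```python
def guidance_gain(progress, k_max=1.0):
    """Assistive gain that fades out as estimated progress
    (0 = novice, 1 = expert-level) increases. Illustrative only."""
    p = min(max(progress, 0.0), 1.0)  # clamp to [0, 1]
    return k_max * (1.0 - p)

def assistive_force(target, current, progress, k_max=5.0):
    """Spring-like guidance toward the reference trajectory point,
    attenuated by the trainee's estimated progress (per axis)."""
    gain = guidance_gain(progress, k_max)
    return [gain * (t - c) for t, c in zip(target, current)]
```

A novice (progress 0) would feel the full guidance force on the master device, while an advanced trainee (progress near 1) would feel almost none, which is one simple way to avoid over-reliance on assistance.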


ACTIVE – Active Constraints Technologies for Ill-defined or Volatile Environments

People involved: Giancarlo Ferrigno, Elena De Momi

Funding source: FP7-ICT-2009
Grant number: FP7-ICT-2009-6-270460

Funding period: 2011 – 2015

Project partners: Department of Electronics, Information and Bioengineering – Politecnico di Milano, Milan, Italy; Istituto di Tecnologie Industriali e Automazione – Consiglio Nazionale delle Ricerche, Milan, Italy; Department of Mechanical Engineering and the Institute of Biomedical Engineering – Imperial College, London, UK; Institute for Process Control and Robotics – Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany; Department of Advanced Robotics – Italian Institute of Technology, Genova, Italy; Faculty of Mechanical Engineering – Technion, Israel Institute of Technology, Haifa, Israel; Lehrstuhl für Computeranwendungen in der Medizin, Institut für Informatik – Technische Universität München, Munich, Germany; Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI), Bremen, Germany; Department of Presurgical Epileptological Evaluation (DDEP) – Milan, Italy; Functional Brain Center – Tel-Aviv Sourasky Medical Center, Tel-Aviv, Israel; Force Dimension – Nyon, Switzerland; Renishaw Ltd.; Medimaton Ltd. – Beaconsfield, UK; CF Consulting – Finanziamenti Unione Europea S.r.l., Milan, Italy; KUKA Laboratories GmbH, Medical Robotics – Augsburg, Germany

The ACTIVE project exploits ICT and other engineering methods and technologies to design and develop an integrated redundant robotic platform for neurosurgery. A light and agile redundant robotic cell with 20 degrees of freedom (DoFs) and an advanced processing unit for pre- and intra-operative control will operate on the brain both autonomously and cooperatively with the surgical staff. Since the patient will not be considered rigidly fixed to the operating table or to the robot, the system will push the boundaries of the state of the art in robotics and control to achieve the accuracy and bandwidth required by this challenging and complex surgical scenario.

Two cooperating robots will interact with the brain, which deforms due to tool contact, blood pressure, breathing and deliquoration. Human factors are addressed by allowing easy interaction with the users through a novel haptic interface for tele-manipulation and through a collaborative control mode (“hands-on”). Active constraints will limit and direct tool-tip position, force and speed, preventing damage to eloquent areas defined on realistic tissue models that are updated on the fly using sensor information. The active constraints will be updated (displaced) in real time in response to feedback from tool-tissue interactions and to any additional constraints arising from a complex shared workspace. The overarching control architecture of ACTIVE will negotiate the requirements and references of the two slave robots.
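As a minimal illustration of the active-constraint idea, the sketch below clamps a commanded tool-tip position to a spherical safe region. The sphere is purely an assumption for illustration: in ACTIVE the constraint geometry comes from deformable tissue models and is updated in real time.

```python
import math

def constrain_tool_tip(commanded, center, radius):
    """Project a commanded 3D tool-tip position back onto a spherical
    safe region (a toy active-constraint / virtual-fixture sketch)."""
    offset = [c - o for c, o in zip(commanded, center)]
    dist = math.sqrt(sum(d * d for d in offset))
    if dist <= radius:
        return list(commanded)      # already inside the safe region
    scale = radius / dist           # pull back onto the boundary
    return [o + d * scale for o, d in zip(center, offset)]
```

Velocity and force limits can be enforced analogously, by scaling the commanded quantity whenever it would leave the allowed set.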

The operating room is the epitome of a dynamic, unstructured and volatile environment, crowded with people and instruments. The workspace will therefore be monitored by environmental cameras, and machine learning techniques will be used to enable safe workspace sharing. Cognitive skills will help identify the target location in the brain and constrain robotic motions by means of on-field observations.

For more information, please visit the ACTIVE website.


EuRoSurge – European Robotic Surgery

People involved: Elena De Momi, Giancarlo Ferrigno

Funding source: FP7-ICT-2011

Funding period: 2011 – 2013


The European Robotic Surgery project (EuRoSurge) is a Coordination and Support Action funded by the European Commission under FP7-ICT-2011-7. It aims at developing a conceptual integration platform for Computer- and Robot-Aided Surgery (CRAS) research and manufacturing, based on the following actions:

  • Identification of the key European players in surgical robotics (technological players, skilled end-users, and EU-funded projects);
  • Identification of the key European players in cognitive sciences relevant to surgery;
  • Creation of a glossary/ontology for cognitive surgical robotics;
  • Specification of a reference architecture for cognitive surgical robotics;
  • Formulation of procedures for validation of surgical robots and their modules;
  • Identification of non-technical roadblocks, e.g. patents, ethical and legal aspects.

For more information, please visit the EUROSURGE website.



ROBOCAST

People involved: Elena De Momi, Giancarlo Ferrigno

Funding source: FP7-ICT-2007

Funding period: 2008 – 2010


The ROBOCAST project focuses on robot-assisted keyhole neurosurgery, i.e. brain surgery performed through a very small hole in the skull called a burr hole; the reduced dimensions are the reason the approach is also called “keyhole” surgery. It is carried out for several interventions, from endoscopy to biopsy and deep brain stimulation. Needles and catheters are inserted into the brain through the tiny hole for biopsy and therapy, including, among others, blood/fluid sampling, tissue biopsy, cryogenic and electrolytic ablation, brachytherapy, deep brain stimulation (DBS), diagnostic imaging, and a number of other minimally invasive surgical procedures. Related pathologies include tumours, hydrocephalus, dystonia, essential tremor, Parkinson’s disease, Tourette syndrome, clinical depression, phantom limb pain, cluster headache and epilepsy.

The outcome of the ROBOCAST project will be a system that assists the surgeon during keyhole interventions on the brain, comprising a mechatronic part and an intelligence part. The mechatronic device will consist of a robot that holds the instruments for the surgeon and inserts them into the brain with a smooth, precisely controlled autonomous movement. The trajectory will be planned by the intelligence of the ROBOCAST system and approved by the surgeon, who is and remains responsible for the outcome, before the surgical instruments are inserted.

For more information, please visit the ROBOCAST website.

Robocast project on Euronews: preparation and interview of Prof. Giancarlo Ferrigno.
