3D printing based on the CT data of a measured object can reproduce both the outer and internal surface structure of the object and can be used for reverse engineering. This paper presents a framework for generating 3D printing data directly from CT data. The framework comprises three steps. First, a surface mesh with internal structure is extracted from the CT data. Second, a mesh model readable by a 3D printer is obtained by generating topological information, removing isolated and non-manifold facets, filling holes, and smoothing the surface. Third, if the target surface mesh contains internal structure or exceeds the maximum build volume of the 3D printer, the framework splits the mesh into several parts that are printed separately. The proposed framework was evaluated on one simulated and two real CT datasets. These experiments showed that the framework effectively generates 3D printing data directly from CT data and preserves the shape of the original object with high precision.
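The splitting step of such a pipeline can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes the mesh is a triangle soup held in numpy arrays, and it assigns whole triangles to one side of a cutting plane by centroid, without re-triangulating along the cut. The function name `split_mesh_by_plane` is hypothetical.

```python
import numpy as np

def split_mesh_by_plane(vertices, faces, point, normal):
    """Partition a triangle mesh into two parts by a plane.

    Simplified sketch of the splitting step: each triangle goes to one
    side or the other according to which side of the plane its centroid
    lies on (no cutting of triangles that straddle the plane).
    """
    normal = np.asarray(normal, dtype=float)
    normal = normal / np.linalg.norm(normal)
    centroids = vertices[faces].mean(axis=1)       # (F, 3) triangle centroids
    side = (centroids - point) @ normal >= 0.0     # (F,) boolean mask
    return faces[side], faces[~side]

# Toy example: two triangles, one on each side of the plane x = 0.
vertices = np.array([[-2.0, 0, 0], [-1.0, 1, 0], [-1.0, -1, 0],
                     [ 2.0, 0, 0], [ 1.0, 1, 0], [ 1.0, -1, 0]])
faces = np.array([[0, 1, 2], [3, 4, 5]])
right, left = split_mesh_by_plane(vertices, faces, point=np.zeros(3),
                                  normal=np.array([1.0, 0, 0]))
print(len(right), len(left))  # 1 1 -- one triangle in each printable part
```

A production splitter would also cut straddling triangles and close the resulting cross-sections so each part remains watertight for printing.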
A method for x-ray image-guided robotic instrument positioning is reported and evaluated in preclinical studies of spinal pedicle screw placement, with the aim of improving delivery of transpedicle K-wires and screws. The known-component (KC) registration algorithm was used to register the three-dimensional patient CT and drill guide surface model to intraoperative two-dimensional radiographs. The resulting transformations, combined with offline hand–eye calibration, drive the robotically held drill guide to target trajectories defined in the preoperative CT. The method was assessed in comparison with a more conventional tracker-based approach, and robustness to clinically realistic errors was tested in phantom and cadaver studies. Deviations from planned trajectories were analyzed in terms of target registration error (TRE) at the tooltip (mm) and approach angle (deg). In phantom studies, the KC approach resulted in TRE = 1.51 ± 0.51 mm and 1.01 deg ± 0.92 deg, comparable to the accuracy of the tracker-based approach. In cadaver studies with realistic anatomical deformation, the KC approach yielded TRE = 2.31 ± 1.05 mm and 0.66 deg ± 0.62 deg, a statistically significant improvement versus the tracker (TRE = 6.09 ± 1.22 mm and 1.06 deg ± 0.90 deg). Robustness to deformation is attributed to the relatively local rigidity of anatomy in the radiographic views. X-ray guidance offered accurate robotic positioning and could fit naturally within the clinical workflow of fluoroscopically guided procedures.
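The transform chain described above can be sketched in a few lines. This is a hedged illustration, not the authors' code: it assumes rigid transforms as 4×4 homogeneous matrices, with hypothetical names (`T_ct_robot` for the registration result, `T_handeye` for the offline calibration), and uses pure translations so the arithmetic is easy to follow.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tre(planned_tip, achieved_tip):
    """Target registration error: Euclidean distance at the tooltip (mm)."""
    return float(np.linalg.norm(planned_tip - achieved_tip))

# Hypothetical transforms (identity rotations, pure translations for clarity):
# T_ct_robot: CT frame -> robot base, from 2D/3D KC registration
# T_handeye:  robot base -> drill-guide tip, from offline hand-eye calibration
T_ct_robot = make_transform(np.eye(3), np.array([10.0, 0.0, 0.0]))
T_handeye  = make_transform(np.eye(3), np.array([0.0, 5.0, 0.0]))

# A target trajectory point planned in the preoperative CT:
p_ct = np.array([1.0, 2.0, 3.0, 1.0])  # homogeneous coordinates

# Drive command: map the CT target through registration, then hand-eye.
p_tip = T_handeye @ T_ct_robot @ p_ct

# If the guide lands 1 mm off along z, the reported TRE is 1 mm:
achieved = p_tip[:3] + np.array([0.0, 0.0, 1.0])
print(round(tre(p_tip[:3], achieved), 3))  # 1.0
```

In practice both transforms carry full rotations, and the approach-angle error would be computed from the angle between the planned and achieved trajectory directions.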
New technology is bringing the power of augmented reality into clinical practice.
The system, called ProjectDR, allows medical images such as CT scans and MRI data to be displayed directly on a patient’s body in a way that moves as the patient does.
“We wanted to create a system that would show clinicians a patient’s internal anatomy within the context of the body,” explained Ian Watts, a computing science graduate student and the developer of ProjectDR.
The technology includes a motion-tracking system using infrared cameras and markers on the patient’s body, as well as a projector to display the images. But the really difficult part, Watts explained, is having the image track properly on the patient’s body even as they shift and move. The solution: custom software written by Watts that gets all of the components working together.
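One common way to make a projection follow a tracked body, which could underlie a system like this (this is an assumption for illustration, not ProjectDR's actual software), is to estimate the rigid transform between the markers' reference positions and their current tracked positions each frame, then re-anchor the projected image through that transform. The sketch below estimates the transform with the Kabsch algorithm (least-squares rotation via SVD); all names are hypothetical.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t mapping src points onto dst
    in a least-squares sense (Kabsch algorithm via SVD)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Marker positions captured once at calibration time:
markers_ref = np.array([[0.0, 0, 0], [1.0, 0, 0], [0.0, 1, 0], [0.0, 0, 1]])
# The patient shifts 5 cm along x; the cameras now see the markers here:
markers_now = markers_ref + np.array([5.0, 0, 0])

R, t = rigid_transform(markers_ref, markers_now)

# Re-anchor a projected image point so it moves with the body:
image_point = np.array([0.5, 0.5, 0.0])
updated = R @ image_point + t  # follows the 5 cm shift: [5.5, 0.5, 0.0]
```

Running this per frame keeps the displayed anatomy registered to the patient; a real system would additionally calibrate the projector and correct for the body's surface shape.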