Precision Healed, Comfort Delivered

We help a small orthotics shop that turns 3-D foot scans into printed insoles. The scans arrive full of holes, paper-thin overlaps, and stray bits of floor or sock. Off-the-shelf “repair” buttons either smooth away the tiny curves that make an insole comfortable or leave sharp ridges that hurt. The shop asked us for a tool that keeps every personal detail while erasing the junk.

So we built one from scratch. First, we turn each messy scan into a tidy map of faces and edges, almost like rebuilding Lego bricks in the right order. Then we choose one of four simple rules—think of them as different ways to decide what counts as “inside” the foot and what doesn’t. Finally, we stitch shut any gaps smaller than the printer’s nozzle and reshape new surfaces so they follow the natural curve of the foot. The whole fix takes less than a second and sends a ready-to-print file straight to their slicer.

Since switching to our pipeline, the shop no longer spends late nights fixing files by hand. Printing starts two hours earlier, and most customers pick up their insoles the same afternoon. We still run into the odd sock edge that fools our code or the occasional giant scan that freezes the preview, but those hiccups are next on our to-do list.

Common Questions

Q: Will your mesh-healing tool work with the scanners and file formats we already use?
- Yes. We ingest any common triangle format—STL, OBJ, PLY—and we don’t depend on a specific scanner brand. If your hardware exports something unusual, we add a reader for it during onboarding; that’s typically a one-day task.
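To show how little machinery a triangle-format reader needs, here is a minimal binary-STL parser in plain Python. This is an illustrative sketch, not our production reader (the function name `read_binary_stl` is invented for this example); it relies only on the published binary-STL layout.

```python
import struct

def read_binary_stl(data: bytes):
    """Parse binary STL bytes into a list of (normal, vertices) triangles.

    Binary STL layout: an 80-byte header, a uint32 triangle count, then
    50 bytes per triangle: normal + 3 vertices as little-endian float32,
    followed by a uint16 attribute field.
    """
    (count,) = struct.unpack_from("<I", data, 80)
    triangles = []
    offset = 84
    for _ in range(count):
        values = struct.unpack_from("<12fH", data, offset)
        normal = values[0:3]
        verts = [values[3:6], values[6:9], values[9:12]]
        triangles.append((normal, verts))
        offset += 50
    return triangles
```

ASCII STL, OBJ, and PLY need their own readers, but each is a comparably small, well-documented format.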

Q: How does the repair step fit into our existing workflow? Do we have to learn new software?
- No new interface is required. The repair engine runs as a small command-line or gRPC service on the same PC that hosts your slicer. Drop a file in the “incoming” folder (or call the API), and a printer-ready mesh appears seconds later in “outgoing.” Your technicians keep using the tools they already know.

Q: Can the algorithm be tuned for the special features of our products—say, softer arches or embedded sensors?
- Absolutely. Every parameter—gap size, curvature tolerance, even which “inside/outside” rule is active—lives in a human-readable JSON profile. We ship a sensible default, then refine those numbers with sample scans from your production line so the healed mesh matches your exact comfort and durability targets.
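For a sense of what such a profile looks like, here is an illustrative fragment. The field names and values below are invented for this example; the actual schema is settled during onboarding with your sample scans.

```json
{
  "gap_max_mm": 0.4,
  "curvature_tolerance": 0.05,
  "inside_rule": "oriented_sum",
  "min_feature_thickness_mm": 0.8
}
```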

Contact Elf.3D to explore how custom mesh processing algorithms might address your unique challenges. We approach every conversation with curiosity about your specific needs rather than generic solutions.

*Interested in discussing your mesh processing challenges? We'd be happy to explore possibilities together.*

Fixing Broken 3D Scans for Better Prosthetics

At Elf.3D we occupy a very narrow slice of the technology world: we write small, highly specific geometric algorithms for teams that the big vendors overlook. Our latest collaboration was with an orthotics workshop that prints personalised foam insoles and prosthetic covers. Every model they produce starts as a hand-held 3D scan of a real limb, usually captured in a clinic or even in a living room. This freedom makes life easier for patients, but it fills every mesh with gaps, folds, and random noise that commercial repair plug-ins can’t handle.

The workshop was clear: “Please clean our meshes—without sanding away the subtle contours that keep our clients comfortable.” They didn’t need AI hype, cloud subscriptions or yet another licence fee. They needed a dependable algorithm that ran on the same workstation as their slicer. That is exactly the kind of brief we live for.

The digital wreckage we inherited

The scans arrived riddled with holes, paper-thin overlapping faces, folded geometry, and stray fragments of floor or sock. Generic “one-click fix” tools either smoothed the model into a potato or kept the defects intact. We needed surgical precision, not blunt force.

Building an algorithm from the ground up

We wrote the repair code from a blank repo, so every line targets the client’s scans and nothing is weighed down by third-party baggage. First, the raw “triangle soup” is rebuilt into our own volumetric half-edge graph, where every vertex, edge, and face remembers where it came from; if a piece later vanishes, we know exactly why. We detect self-intersections with exact arithmetic to avoid the tolerance tricks that usually blur fine anatomical details.
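To make the rebuilt edge structure concrete, here is a stripped-down half-edge construction in Python. This is a toy sketch: the production structure also tracks provenance per element and uses exact arithmetic, both of which are omitted here.

```python
from dataclasses import dataclass

@dataclass
class HalfEdge:
    origin: int                       # vertex this half-edge leaves
    face: int                         # triangle it borders
    twin: "HalfEdge | None" = None    # opposite half-edge on the neighbour

def build_half_edges(triangles):
    """Turn a triangle-index list into paired half-edges.

    Each directed edge (a, b) gets one HalfEdge; when the reverse edge
    (b, a) exists on a neighbouring triangle, the two are linked as twins.
    """
    edges = {}
    for f, (a, b, c) in enumerate(triangles):
        for u, v in ((a, b), (b, c), (c, a)):
            he = HalfEdge(origin=u, face=f)
            edges[(u, v)] = he
            rev = edges.get((v, u))
            if rev is not None:
                he.twin, rev.twin = rev, he
    return edges

def boundary_edges(edges):
    """Half-edges with no twin trace the holes that need patching."""
    return [key for key, he in edges.items() if he.twin is None]
```

The payoff of the structure is visible in `boundary_edges`: once twins are paired, every hole in the scan announces itself as a loop of unmatched half-edges.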

Next, we decide what belongs inside the model and what doesn’t. Four rules—Even-Odd, Oriented-Sum, Union, and Intersection—cover messy walk-around scans, asymmetric arches, merged partial scans, and quality checks, respectively. A technician picks the rule with a single JSON flag, no menus required.

Finally, an adaptive patcher closes only the gaps the printer can resolve. It inserts a Delaunay surface, projects it to follow the local curve, and merges any sliver thinner than the nozzle width, so edges don’t bulk up. Normals are unified automatically, because the slicer rejects mixed orientations. The whole repair finishes in under a second on an ordinary desktop.
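Of those steps, normal unification is the easiest to show in miniature: two triangles agree in orientation exactly when their shared edge is traversed in opposite directions by each. The breadth-first sweep below flips disagreeing neighbours. It is a toy sketch assuming a single connected shell; the Delaunay patching and sliver merging are not shown.

```python
from collections import deque

def unify_orientation(triangles):
    """Flip triangles until every shared edge is walked in opposite
    directions by its two faces. Mutates and returns the list; the
    seed triangle's orientation is kept."""
    # Map each undirected edge to the faces that use it.
    edge_faces = {}
    for f, tri in enumerate(triangles):
        for i in range(3):
            key = frozenset((tri[i], tri[(i + 1) % 3]))
            edge_faces.setdefault(key, []).append(f)

    def directed_edges(tri):
        return {(tri[i], tri[(i + 1) % 3]) for i in range(3)}

    seen = {0}
    queue = deque([0])
    while queue:
        f = queue.popleft()
        for u, v in directed_edges(triangles[f]):
            for g in edge_faces[frozenset((u, v))]:
                if g in seen:
                    continue
                # Consistent neighbours walk the edge v -> u; if g also
                # walks u -> v, its winding disagrees and must flip.
                if (u, v) in directed_edges(triangles[g]):
                    a, b, c = triangles[g]
                    triangles[g] = (a, c, b)
                seen.add(g)
                queue.append(g)
    return triangles
```

Because the sweep propagates from one seed, a single flipped patch anywhere on the shell gets corrected, which is what keeps the slicer from rejecting mixed orientations.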

The bumps in the road

Detail loss in shallow wrinkles: Our first prototype mistook genuine plantar wrinkles for noise, erasing subtle pressure-relief channels. We rewrote the classifier to accept a per-file minimum-thickness hint supplied by the workshop.

False positives on fabric: A sock cuff hugging an ankle still fools the system; a technician currently sketches a “keep” mask before repair.

UI freezes on huge scans: Anything above eight million triangles locked the preview. We’re experimenting with live decimation and GPU streaming, but that remains on the to-do list.

Life after healing

Today the workshop exports a raw scan, ticks one checkbox, and receives a printer-ready STL without opening Blender. Printing starts earlier in the day, so most clients now pick up their insoles the same afternoon instead of “sometime tomorrow.” The change is modest, but for a sports-rehab patient or an elderly customer, a single day matters.

What comes next

We’re extending the engine with selective remeshing: densifying the plantar arch while coarsening flat support zones to cut print time further. We also plan to embed a cloth-detector that separates woven texture from skin without manual masks. And yes—live previews that never freeze, even on monster scans. The algorithm is a living thing; as the client’s needs evolve, the code evolves with them.

About Elf.3D: We are a small team of three engineers and one project manager. Rather than selling the same software to everyone, we create new code for each partner and their specific data. If your 3-D workflow suffers from geometry issues that mainstream tools overlook, contact us at info@elf3d.com. We will start with a blank repository and craft a solution around your needs.