Uh, OK. So a few decades ago a scientist I respect built his own scientific tool from parts (https://www.nature.com/articles/35073680), and I was really blown away by that idea, especially because most scientific tools are very expensive and full of proprietary components. I asked around at the time (~2001) and there wasn't much accessible knowledge about how to control stepper motors, assemble rigid frames, etc.
Although my day job is running compute infra, I have a background in biophysics, and I figured I could probably do something similar to Joe DeRisi, but I lacked the knowledge, time, and money to do it either in the lab or at home, so the project stayed mostly on the backburner. I got lucky and joined a team at Google a decade ago that did Maker stuff. At some point we set up a CNC machine to automate some wood-cutting projects, and I realized the machine could be adapted into a microscope that scans areas much larger than the field of view of the objective. I took a Shapeoko, replaced the cutting tool with a microscope head (cheap objective, cheap lens tube, cheap camera), demonstrated it, and got some good images and a lot of technical feedback.
As I now had more time, money, and knowledge (thanks, Google!), I thought about what I could do to make scientific-grade microscopes using 3D-printer parts, 3D printing, and inexpensive components. There are a lot of challenges, so I've spent the past decade slowly designing and building my scope and using it to do "interesting" things.
At this point, what I have is: an aluminum frame built from inexpensive extrusion, some 3D-printed junction pieces, some JLCPCB-machined aluminum parts for the 2D XY stage, inexpensive off-the-shelf lenses and an industrial vision camera, a few more adapter pieces, and an LED illuminator. It's about $1,000 in materials, plus far more than that in assembly time and learning.
What I can do: the scope easily handles scanning large fields of view (50mm x 50mm) at 10X magnification and assembles the scans into coherent full-size images (often 100,000x100,000 pixels). It can also run a computer vision model trained to identify animalcules (specifically tardigrades) and center the sample, allowing for tracking as the tardigrade moves about in a large Petri dish. This is of interest to tardigrade scientists who want to build models of tardigrade behavior and turn tardigrades into a model organism.
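The stitching part is conceptually simple: the stage coordinates tell you roughly where each tile belongs, and cross-correlation on the overlap regions cleans up the residual error before everything gets pasted into one giant mosaic. Here's a rough Python/OpenCV sketch of that idea; the pixel scale, tile size, and overlap are invented numbers, and this isn't my actual pipeline:

    import cv2
    import numpy as np

    # Illustrative numbers only -- not the real values from my scope.
    PX_PER_MM = 1100            # assumed pixel scale at 10X
    TILE_H, TILE_W = 2048, 2448 # camera resolution in pixels
    OVERLAP_PX = 200            # nominal overlap between neighboring tiles

    def residual_shift(tile_a, tile_b):
        """Measure how far the actual overlap between two neighboring tiles
        deviates from the stage-predicted overlap, via phase correlation."""
        a = np.float32(cv2.cvtColor(tile_a[:, -OVERLAP_PX:], cv2.COLOR_BGR2GRAY))
        b = np.float32(cv2.cvtColor(tile_b[:, :OVERLAP_PX], cv2.COLOR_BGR2GRAY))
        (dx, dy), _response = cv2.phaseCorrelate(a, b)
        return dx, dy           # sub-pixel correction for tile_b's placement

    def place_tiles(tiles, stage_positions_mm):
        """Paste tiles onto one big canvas at their stage-predicted positions.
        A real pipeline would also apply residual_shift() corrections and
        blend the seams; this just shows the coordinate bookkeeping."""
        xs = [int(x * PX_PER_MM) for x, _ in stage_positions_mm]
        ys = [int(y * PX_PER_MM) for _, y in stage_positions_mm]
        canvas = np.zeros((max(ys) + TILE_H, max(xs) + TILE_W, 3), np.uint8)
        for tile, x, y in zip(tiles, xs, ys):
            canvas[y:y + TILE_H, x:x + TILE_W] = tile
        return canvas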
Right now I'm working on a sub-sub-sub-project: replacing the LED illuminator with a new design capable of extremely bright pulses of extremely short duration, which lets me acquire scans much faster (a short, bright flash freezes motion, so the stage doesn't have to stop and settle for every exposure). I am revelling in low-level electronic design and learning the tricks of the trade, much of which boils down to "5 minutes of soldering can save $10,000".
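To give a flavor of the digital side (the analog/power side is where the real work and the soldering lessons live): the controller just has to fire a gate pulse a few tens of microseconds long into a constant-current LED driver whenever the camera signals that an exposure has started. Something like the MicroPython sketch below; the board, pin numbers, and pulse length are made up for illustration, not my actual driver:

    # MicroPython sketch for an RP2040-class board (pins and timing invented).
    # CAM_TRIG is wired to the camera's exposure-active output; LED_GATE
    # drives the enable input of an external constant-current LED driver.
    from machine import Pin
    import time

    CAM_TRIG = Pin(14, Pin.IN, Pin.PULL_DOWN)   # exposure-active from camera
    LED_GATE = Pin(15, Pin.OUT, value=0)        # enable line of the LED driver

    PULSE_US = 50   # flash length; short enough to freeze stage motion

    def flash(_pin):
        # Fire one short gate pulse when the camera starts an exposure.
        LED_GATE.value(1)
        time.sleep_us(PULSE_US)   # busy-wait; expect a few microseconds of jitter
        LED_GATE.value(0)

    CAM_TRIG.irq(trigger=Pin.IRQ_RISING, handler=flash)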
I had hoped to make this project my full-time job, but the reality is that there isn't much demand for stuff like this, and if it does become your job, you typically end up focusing on convincing leadership to give you money to buy an existing scope designed by experts and using that to make important discoveries (I work in pharma, which does not care about tardigrades).
Eventually, I hope, I will retire and move on to the more challenging nanoscale projects. It turns out that while building microscopes accurate to microns with off-the-shelf hardware is fairly straightforward, getting to the nanoscale involves understanding a lot of what was learned between the 1950s and now about ultra-high-precision engineering, which is much more subtle and expensive.
Here's a sample video of tardigrade tracking; you can see the scope moving the stage to keep the "snout" centered: https://www.youtube.com/watch?v=LYaMFDjC1DQ
And another: an empty tardigrade shell filled with eggs that are about to hatch, with the first baby exiting the old shell at around the 10-minute mark: https://www.youtube.com/watch?v=snUQTOCHito
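The control loop behind the tracking in the first video is conceptually just: detect, measure the offset from the image center, nudge the stage, repeat. A rough Python sketch of that shape is below; the detector, pixel scale, serial port, and G-code interface are stand-ins for illustration, not my actual code:

    import cv2
    import serial                    # pyserial, talking G-code to the motion controller
    from ultralytics import YOLO     # example detector; weights filename is hypothetical

    # Hypothetical values for illustration.
    MM_PER_PX = 0.0009               # pixel scale at 10X
    DEADBAND_MM = 0.02               # ignore tiny offsets so the stage doesn't jitter

    cam = cv2.VideoCapture(0)
    stage = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
    model = YOLO("tardigrade.pt")    # fine-tuned weights (hypothetical filename)

    def jog(dx_mm, dy_mm):
        """Relative stage move in mm; G91 = relative mode, G0 = rapid move."""
        stage.write(f"G91 G0 X{dx_mm:.3f} Y{dy_mm:.3f}\n".encode())

    while True:
        ok, frame = cam.read()
        if not ok:
            break
        results = model(frame, verbose=False)
        boxes = results[0].boxes
        if len(boxes) == 0:
            continue                              # lost it; a real loop would search
        x1, y1, x2, y2 = boxes.xyxy[0].tolist()   # take the first detection
        err_x = (x1 + x2) / 2 - frame.shape[1] / 2   # pixel offset from image center
        err_y = (y1 + y2) / 2 - frame.shape[0] / 2
        dx, dy = err_x * MM_PER_PX, err_y * MM_PER_PX
        if abs(dx) > DEADBAND_MM or abs(dy) > DEADBAND_MM:
            jog(dx, dy)   # sign/axis mapping depends on the scope geometry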
Yes, I took an existing vision model that could run in realtime on my laptop and fine-tuned it on a few hundred manually labelled images of tardigrades.
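For anyone who wants to try the same thing, the generic recipe with something like a small YOLO checkpoint and the Ultralytics package (an example, not necessarily what I used) is: label a few hundred frames, write a small dataset config, and run a short training job:

    # Generic fine-tuning recipe using the Ultralytics package as an example;
    # model choice, dataset paths, and hyperparameters are placeholders.
    from ultralytics import YOLO

    # Start from a small pretrained checkpoint that runs in realtime on a laptop.
    model = YOLO("yolov8n.pt")

    # tardigrades.yaml points at train/val image folders and defines one class, e.g.:
    #   path: datasets/tardigrades
    #   train: images/train
    #   val: images/val
    #   names: {0: tardigrade}
    model.train(data="tardigrades.yaml", epochs=50, imgsz=640)

    # Sanity-check on a held-out frame.
    results = model("sample_frame.jpg")
    results[0].show()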
I don't like the OpenFlexure design at all. I mean... obviously it works for a lot of people, but I just don't want a flexure-based stage. I like real two-axis stages based on rolling bearings, basically cloning the X and Y parts of this: https://www.asiimaging.com/products/stages/xy-inverted-stage...
UC2 (https://openuc2.com/) is another cool project, but I found their approach constraining.
Frankly, I think you could just buy an inexpensive 3D printer with open firmware, replace the extruder with an objective, a tube, and a camera, and you'd have something up and running for less money and in less time.
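The reason that works is that the printer's motion system already does the hard part; a scan is just a loop of "move, wait for the move to finish, grab a frame." Here's a rough Python sketch against a Marlin-style firmware over USB serial (the port, feed rate, grid size, and step spacing are made-up numbers):

    import time
    import cv2
    import serial   # pyserial; Marlin-style firmware accepts G-code over USB serial

    # Hypothetical setup values.
    PORT, BAUD = "/dev/ttyUSB0", 115200
    STEP_MM = 2.0            # tile spacing, a bit less than the field of view
    GRID_X, GRID_Y = 10, 10  # 10 x 10 tiles

    printer = serial.Serial(PORT, BAUD, timeout=2)
    time.sleep(2)            # give the controller time to reset after the port opens
    cam = cv2.VideoCapture(0)

    def send(gcode):
        """Send one G-code line and wait for the firmware's 'ok'."""
        printer.write((gcode + "\n").encode())
        while b"ok" not in printer.readline():
            pass

    send("G28 X Y")          # home the two axes we use
    send("G90")              # absolute positioning
    for j in range(GRID_Y):
        for i in range(GRID_X):
            send(f"G0 X{i * STEP_MM:.2f} Y{j * STEP_MM:.2f} F1500")
            send("M400")     # wait until the move has actually finished
            time.sleep(0.2)  # let vibrations die down before exposing
            ok, frame = cam.read()
            if ok:
                cv2.imwrite(f"tile_{j:02d}_{i:02d}.png", frame)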