## DEX LOG
## 2019 11
### Vision Controller Hello-World
We're currently working to build a computer-vision-based displacement sensing method for the DEX. Since our machine (and, indeed, many machines built by novices or circulating in the public domain) is liable to flex (nothing is infinitely stiff!), the thought is to measure displacement locally, at the sample, rather than measuring open-loop through the machine's structure.
To spin this up, I've written a small / barebones subpixel template tracker in the browser, in cuttlefish. This is conveniently lightweight - the whole cycle (image collection -> analysis) can happen at ~ 10Hz, which is not splendid, but not diabolical either.
![vd](video/2019-11-17_micrometers-cv.mp4)
Here we can see a desktop test - I am reading the X position from my tracking system onto a chart while moving the tracked target on a linear stage. The system resolves ~ 15um, which is not bad for a proof of concept.
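For reference, the core of a tracker like this is just template matching plus a sub-pixel peak fit. Below is a minimal sketch in Python/OpenCV - not the cuttlefish/browser code itself; `cv2.matchTemplate` and the parabolic peak refinement are stand-ins for whatever the browser module does:

```python
# Minimal subpixel template tracker: normalized cross-correlation
# followed by a three-point parabolic fit around the correlation peak.
# Illustrative sketch only; the in-browser cuttlefish module does the
# equivalent in JS.
import cv2
import numpy as np

def track(frame_gray, template_gray):
    # full-frame normalized cross-correlation
    res = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, (px, py) = cv2.minMaxLoc(res)  # integer peak location (x, y)

    def parabolic(c_minus, c_zero, c_plus):
        # vertex offset of the parabola through three samples, in [-0.5, 0.5]
        denom = c_minus - 2.0 * c_zero + c_plus
        return 0.0 if denom == 0 else 0.5 * (c_minus - c_plus) / denom

    # refine each axis using its immediate neighbours (guard the borders)
    dx = dy = 0.0
    if 0 < px < res.shape[1] - 1:
        dx = parabolic(res[py, px - 1], res[py, px], res[py, px + 1])
    if 0 < py < res.shape[0] - 1:
        dy = parabolic(res[py - 1, px], res[py, px], res[py + 1, px])
    return px + dx, py + dy  # subpixel template position in the frame
```

Turning pixels into micrometers then only needs a scale calibration (pixels per mm of stage travel), which is how the ~15um figure above is read off the chart.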
## 2019 10
Re-did the machine last week, now much simpler:
![dex](images/2019-09-27_DEX-CAD.png)
...
## Comparison to Instron 4411 (2019 10)
To see how we do against a real Instron, I tested identical samples on the DEX and on an Instron 4411 with a 5kN load cell. In the plot below (I'm using cuttlefish to plot the .csv I saved from Bluehill, the Instron software), the leftmost plot is taken on the 4411, and the lazier slope belongs to the DEX.
While the samples fail around the same load, the difference in elongation is ~ 1.5mm: this is almost surely the machine's own deflection, stretch in the belts, etc.
![dex-compare](images/2019-10-17_data-compare.png)
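One quick way to put a number on that compliance - a sketch, assuming both exports reduce to load/extension column pairs and treating the 4411 trace as ground truth; the filenames and column headers here are placeholders, not the actual Bluehill or DEX export names:

```python
# Rough machine-compliance estimate: at matched loads, the extra extension
# the DEX reports over the 4411 is (mostly) frame deflection plus belt
# stretch. Filenames and column names are assumptions for illustration.
import numpy as np
import pandas as pd

def rising(df):
    # keep only the pre-peak (loading) portion so np.interp sees an
    # increasing load axis; noisy data may still want light smoothing
    peak = int(np.argmax(df["load_N"].to_numpy()))
    return df.iloc[: peak + 1]

instron = rising(pd.read_csv("instron-4411.csv"))  # hypothetical filenames
dex = rising(pd.read_csv("dex.csv"))

# common load axis inside the overlap of both tests; the 50 N floor is an
# arbitrary cutoff to skip the toe region of the curves
loads = np.linspace(50.0, 0.95 * min(instron["load_N"].max(), dex["load_N"].max()), 200)
x_instron = np.interp(loads, instron["load_N"], instron["extension_mm"])
x_dex = np.interp(loads, dex["load_N"], dex["extension_mm"])

compliance = (x_dex - x_instron) / loads  # mm of extra deflection per N
print(f"median machine compliance ~ {np.median(compliance):.2e} mm/N")
```

With that number in hand, subtracting compliance × load from the DEX extension is a first-order software correction, though measuring at the sample (below) is the more direct fix.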
This obviously warrants correction. One way to do this is to build a stiffer machine, but that drives up cost and complexity. Rather, we should throw some more control at it. To start, we can circle back to our attempts at [subpixel tracking](https://gitlab.cba.mit.edu/calischs/subpixel_tracking), or attach a small linear encoder directly to our fixturing elements. For the latter, I am imagining something like the [AS5311](https://ams.com/as5311), which resolves 12 bits across a 2mm magnetic pole pair (roughly 0.5um per count). Either can be added to existing systems, given network controllers / modular browser code. Since I want to integrate it elsewhere, it's likely that the camera option comes first.
**update** the new machine uses a ballscrew transmission rather than the belt used here, which should eliminate most of the creep seen in this test against the 4411. Forward progress has been made on a vision-based controller to improve things further, but it has not been characterized yet.
#### Some Force Maths (and a .xlsx file)
......