Showing posts with label programming. Show all posts

Friday, April 25, 2014

Wearable Electronics: Arduino ISP + ATTiny85 with WS2812/2811 addressable RGB LED strip

Many moons ago, Mr.ChongSP ordered some individually addressable RGB LED strip (60 LED/m) that uses the WS2812 from middle country: aliexpress. Adafruit has a similar product, the Adafruit NeoPixel. Yours truly managed to “BBS” 1x courtesy of Mr.ChongSP. Hardware is easy to procure, but “free” time to play with this new toy is hard to come by; yours truly has been working to the tune of “The Beatles - Eight Days a Week” for the last couple of months. Nonetheless, when responsibility comes (to appear alongside Mr.JolyonC on stage for the Freshman Orientation Program “From Faraday to Fusion”), there is a need to push the boat out. The thing is supposed to be a stage piece to grab freshies’ attention, so addressable RGB LEDs on a jacket it shall be.

Since it is a wearable, there are several design (engineering) considerations: first, the power supply unit (PSU) to power both the microcontroller (MCU) and the RGB LED strip; second, the size of the electronics package; and third, a diffuser for the RGB LED strip.
Using an Arduino is a popular choice, but given its footprint it is easy to spot from afar; not a good design choice for a wearable. Furthermore, the Arduino requires a 9V supply whereas the WS2812/2811 addressable RGB LED strip requires 5V. If both are used together, a separate circuit with both 7805 (5V) and 7809 (9V) voltage regulators is needed, fed from a 12V 1A supply. Having said that, a 12V battery pack plus voltage regulator circuitry increases the footprint of the total package.
Therefore, the logical choice is the ATTiny85, which can be powered from a 3 to 6V battery; the PSU is a 5V 1A mobile power bank with a USB connector, the kind meant for charging smartphones on the go. This one PSU can power both the addressable RGB strip and the ATTiny85.
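As a sanity check on the 5V 1A power bank choice: each WS2812 pixel can draw roughly 60 mA at full white (the commonly quoted worst case, not measured here), so the usable pixel count is bounded by the 1A budget unless brightness is limited. A rough back-of-envelope sketch in Python:

```python
def max_pixels(budget_a, ma_per_pixel=60, brightness=1.0, mcu_ma=10):
    """Worst-case WS2812 power budget: how many pixels fit in the supply budget.

    60 mA/pixel full white and 10 mA for the MCU are assumed figures.
    """
    available_ma = budget_a * 1000 - mcu_ma        # reserve some current for the ATTiny85
    return int(available_ma // (ma_per_pixel * brightness))

full = max_pixels(1.0)                   # full white: only ~16 pixels fit in 1 A
dimmed = max_pixels(1.0, brightness=0.25)  # at ~25% brightness, a 60-pixel metre fits
```

The takeaway: run the strip well below full white, or keep the lit-pixel count low, when powering a whole metre from a 1A bank.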
As for the diffuser, 3D printed spikes using ninjaflex http://shin-ajaran.blogspot.sg/2014/04/3d-printing-using-ninjaflex-with.html !
closeup
Parts needed
1x Arduino as the In System Programmer (ISP); detailed guide here http://shin-ajaran.blogspot.sg/2014/01/setting-up-software-for-using-arduino.html
1x ATTiny85 breakout board of choice; using a custom MitG PCB
1x ATTiny85
1x WS2812/2811 addressable RGB LED strip (it can be neopixel from adafruit, or pseudo neopixel from middle country:aliexpress)
Courtesy of the techno arsenal available to SP MAKERS CLUB, all of the parts listed above are available, as displayed in the image above.
Setting up the software environment
Although the RGB LED strips on hand are not Adafruit NeoPixels, the Adafruit NeoPixel code library released on GitHub https://github.com/adafruit/Adafruit_NeoPixel saves a lot of time and effort by not re-inventing the wheel.
The “now” trend of selling a product over the Internet: in order to increase ownership among non-techies, manufacturers produce very detailed, idiot-proof step-by-step guides. In the NeoPixel's case, the UBERGUIDE is definitely very useful. In the hands of a techie, it becomes some sort of cheat code, used far beyond its intended purpose.
Thus, follow the UBERGUIDE (URL is in references section at the bottom of this post) closely to setup the software environment.
Setting up the hardware
Wiring is dead easy: connect the data pin of the addressable RGB LED strip to a pin of your choice, in this case digital pin 4 of the ATTiny85. NOTE: you must ensure a common ground by connecting the ground wire of the RGB LED strip to the ground pin of the MCU.
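The actual sketch uses the Adafruit_NeoPixel library on pin 4; the chase/wipe logic it typically runs (in the spirit of the library's strandtest example) can be illustrated hardware-free in Python. The 8-pixel count and the colour are arbitrary:

```python
def color_chase(n_pixels, color):
    """Light pixels one at a time, like the library's colorWipe() example.

    Returns a snapshot of the whole strip after each simulated strip.show().
    """
    strip = [(0, 0, 0)] * n_pixels   # all pixels start dark
    frames = []
    for i in range(n_pixels):
        strip[i] = color             # equivalent of setPixelColor(i, color)
        frames.append(list(strip))   # equivalent of show()
    return frames

frames = color_chase(8, (255, 0, 0))  # wipe red across an 8-pixel strip
```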
Programming the hardware
First, ensure all the hardware is set up according to the earlier tutorial, e.g. the ATTiny85 on the ISP shield, the ISP shield on the Arduino, the USB cable plugged in, and the Arduino drivers installed.
Set the “programmer” to Arduino as ISP, as depicted in the image above.
Set the “board” to ATtiny85 8MHz. Rule of thumb: choose 1MHz if low energy consumption is required; the downside is that the computation speed of the code suffers. The 20MHz option will not work out of the box, as it requires an external oscillator as the clock.
Set the “Serial Port” to the one detected on your computer.  
Double check the “parameters” on the lower right hand corner as depicted in the image above.
Everything seems to be prim and rosy. But we are not done yet.
If the ATTiny85 is new out of the box, the following step is needed to burn the “bootloader”. Otherwise the code will compile and download accordingly, but the program will not start upon reset.
The above can be skipped if the ATTiny has already been programmed at 8MHz.
The source code is in the public domain at the following URL
Now, click the upload button! Fingers crossed if it pleases you.
Plug in the USB power bank to test.
Wearable electronics = WS2812/2811 RGB LED strip + NinjaFlex + ATTiny85 + Arduino + USB mobile power bank + visibility vest
Video 


Notes:
ATTinyXX MCUs have a very small memory footprint; a code footprint above 4k will trigger the error “relocation truncated to fit: R_AVR_13_PCREL”. Use the patch and the description of the error from the following URL to fix it: http://forum.arduino.cc/index.php?topic=116674.0
References


Saturday, April 12, 2014

\m/ rock on rave helmet for electric run, electro dance music

Sick of generic off-the-shelf items for events such as the Electric Run or electronic dance music festivals????
Make a customized item!
Earlier, I devised a wirelessly charged RGB LED fiber optic bangle for the missus. She was going to be my pacer, so that when we go for a run, assuming we run close enough to each other, there will be LIGHT. Then I came to realise I have no missus, and the bangle was not sized for a bloke, so I shelved it. Therefore I devised this helmet specifically for the events above. This rave item is sort of a motivating factor for yours truly, the fatty bom bom, to flex some muscle besides juicing the grey matter. I am also partially motivated after seeing Natalina’s fiber optic dress; it is incomplete without a blinking rave head gear of some sort.

Bill of materials
1. Programmable RGB LED light source of some sort using an MCU. I have used my custom RGB LED PCB for the ATTiny85. Details of designing the PCB are available here; a close-up of the assembly of the contraption is available in my earlier Instructables.
2. Light diffuser of some sort. I have devised and 3D printed a \m/ rock on insignia in natural PLA with 5% infill and 2 shells.
3. Side glow fiber optic cables.
4. Helmet.
Step0
Acquire the components, decide how to route the fiber optics, and measure the length needed. Assemble the programmable RGB LED light source PCB. The final assembly should look similar to the following pictures. I have some surplus through-hole LED diffusers lying around, so I repurposed them to hold the RGB LEDs.
Step1
3D print a \m/ rock on insignia in natural PLA with 5% infill and 2 shells. The 3D model of the \m/ rock on is uploaded to Thingiverse; feel free to download it. Assemble the contraption as per the following picture. Insert the 5mm fiber optic cables and they should fit snugly.
Step2
Program the MCU, in this case an ATTiny85. The public domain RGB LED spectrum-fading source code is available here.
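The spectrum fade is essentially a walk around the colour wheel. The usual position-to-RGB helper (the Wheel() function from the NeoPixel examples, transcribed into Python here for illustration; the actual ATTiny85 source is the one linked above) behaves like this:

```python
def wheel(pos):
    """Map a position 0-255 to an RGB colour: red -> green -> blue -> red."""
    pos %= 256
    if pos < 85:                               # red fading into green
        return (255 - pos * 3, pos * 3, 0)
    if pos < 170:                              # green fading into blue
        pos -= 85
        return (0, 255 - pos * 3, pos * 3)
    pos -= 170                                 # blue fading back into red
    return (pos * 3, 0, 255 - pos * 3)

spectrum = [wheel(p) for p in range(256)]  # one full fade cycle
```

Stepping `pos` once per loop iteration on the MCU, with a small delay, produces the continuous spectrum fade.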
Step3
Fingers crossed. Plug in a 6v supply. Igor, PULL THE SWITCH!!!

A video will follow later once I find a human willing to wear it. I find it very difficult to take a selfie with my overgrown smartphone while wearing my new contraption.
Wearing my new rave helmet, I was prancing to the venue for the Electric Run, only to realize it is a paid event. I thought it was FREE.... silly me. From public domain info, apparently early birds that booked the run enjoyed a huge discount compared to late birds like me trying to sign up at the last minute. So, I decided to keep the cash for some Mickey D’s and continue to be a fatty bom bom.

Monday, March 3, 2014

Project you are the ONE: a wirelessly powered fiber optic side glow diffused bling

When I was a kid, I was fascinated by the world Tesla and Edison lived in. What intrigued me most was the constant debate of AC versus DC (at that time), and Tesla’s vision of having power transmitted wirelessly: no cables necessary, no copper mined unnecessarily, and friendly to all humans. Wireless power transfer (inductive charging) was a pretty far-fetched idea at that time. Nonetheless, the man himself worked tirelessly (and possibly drove himself penniless) to prove his “thing”. Tesla’s destitute demise, contrary to Edison’s prosperous life, struck me really hard. I nearly gave up on the dream of studying engineering, thinking I should be a businessman or a middleman making the in-between of deals.
While growing up, I did have my stab at making a wireless power transfer kit, reading up on various recipes from sources such as textbooks and “cook books” from BBSes, and putting the material to the test. At that time, I couldn’t even tell the difference between normal copper wire and enameled wire; both looked the same to me. Without a master to guide me in the craft of making wireless power transfer (inductive charging) work, and after many failures, I concluded that I was probably better off hitched to my computer (an Intel 486).
Recently, while reading up on “Qi”, the inductive power (wireless charging) standard for smartphones, I suddenly realized this might be the perfect time: the inductive charging technology has matured enough for end users like me to toy with the idea.
I have this idea of making novelty jewelry for the missus: wearable electronics of some sort with wireless power transfer, aka inductive charging. The concept storyboard goes this way: at a seemingly random event, I would have a little girl present her with a nicely decorated box containing the novelty jewelry I made, with a message asking her to “follow the rabbit”. Hopefully the design of the jewelry would be so tempting that she would put it on straight away. Then a rabbit-inspired character would walk past her and, hopefully, she would pick up the subtle message to follow the rabbit. While following the rabbit, she would come across a few staged characters, and the last character to appear would be me. Naturally, we would reach out to each other. Me, being the techie, would have the transmitter end of the inductive charging well hidden in my hand, hooked up to a disguised, ubiquitous-looking mobile power supply delivering 12V, 1A.
All of a sudden (it is just a matter of time/distance for the EM fields to resonate between the tx loop and the rx loop), her novelty jewelry will light up, and the light intensity will grow as we move closer to touching! YESS! You are the one! Both of us will proclaim. That’s the perfect time for me to take a knee, standing by with a unique marriage proposal ring.

What else? Propose to her!! this engineered piece of art definitely will work. Trust me, I am an engineer.
Oh waittttttttttttttttttttttttttttttttt…………I don’t have a missus/wife/GF yet.
This Instructable assumes the following parts.
1x wireless charging kit. I got a Chinese-made set at 13 USD from aliexpress.
1x RGB LED fading PCB apparatus of some sort, consisting of a microcontroller such as an Arduino or ATTiny85, an RGB LED, and a custom PCB or veroboard. A tutorial to program the ATTiny85 with an Arduino is available, and the necessary ATTiny ISP shield can be made too. There are many derivatives floating around on the Internet; I have used my own recipe of ATTiny85 with RGB. The step-by-step guide to “cooking” a PCB of your own is available here.
1x 5mm side glow fiber optic cable, sufficient to cover the perimeter of the wearable apparatus of choice.
1x 3D printed custom-made jewellery piece to hold the electronics, rx loop, and fiber optic. I have chosen a 3D printed bangle. The STL is available here; print it twice, and the two halves snap fit. It was done in SketchUp with help from xinteng, a DCPE yr1.
Step1: Assemble the RGB fading PCB, program the ATtiny85, and mount it onto the PCB. The fiber optic cable is then inserted into a 5mm heat shrink tube, and the contraption is then fitted onto the 5mm RGB LED.
Step2: the contraption assembled in the earlier step is then fitted to the 3D printed bangle. The fiber optic cable is elastic and should not be bent at sharp angles. It kept slipping out of the 5mm gap designed to hold it, so I had to resort to cable ties to hold it in place.
Step3: Test the contraption with a 3V battery for functionality.
Step4: Assemble the contraption with the RX induction coil and PCB. I had to resort to some masking tape to keep the wires in place.
Step5: Test the contraption with the TX loop connected to a DC power supply. The power supply is set to 12V, 1A.
Step6: Final check before turning on the DC supply. After turning on the DC supply, observe the behaviour of the EM fields w.r.t. the tx and rx loops. Note: no wireless transfer occurs if the rx and tx loops are orthogonal to each other; the EM fields just cancel each other out.
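The orthogonality note can be made concrete: the flux the rx loop links scales with the cosine of the tilt angle between the loops, so coupling drops to essentially zero at 90 degrees. A toy model, keeping only the angular factor and ignoring coil geometry and distance:

```python
import math

def relative_coupling(angle_deg):
    """Relative flux linkage between tx and rx loops versus tilt angle.

    Toy model: only the cos(angle) factor, no geometry or distance terms.
    """
    return abs(math.cos(math.radians(angle_deg)))

coaxial = relative_coupling(0)      # loops parallel: maximum coupling
tilted = relative_coupling(45)      # partial coupling
orthogonal = relative_coupling(90)  # loops orthogonal: essentially zero transfer
```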


Look! No batteries needed!

Closeup
here comes the video

Sunday, February 2, 2014

cheat’s Q&D Hadoop 0.23.6 install guide

Hadoop is one of the most popular open source “Cloud Computing” platforms, used to crunch massive amounts of data on generic hardware (computer hardware that is non-proprietary and does not necessarily have to be identical). It is not exactly “Cloud Computing” per se; it is a computing architecture meant for processing massively large amounts of data in parallel. Taxonomically, Parallel Computing (the predecessor of cloud computing) would be the closer terminology. Hadoop comes with several features, most notably HDFS (the Hadoop Distributed File System) and MapReduce. I attempt to describe HDFS and MapReduce in one-liners. HDFS: an open source cousin of GFS (the Google File System), it provides a framework to manage data redundancy, and most importantly, scaling is as simple as adding more generic hardware. MapReduce: a programming model for processing very large amounts of data that leverages the classic divide-and-conquer approach, through a Map stage followed by a Reduce stage. On top of that, it performs sorting intrinsically via the programming model. Oh wait… I busted my one-liner quota for MapReduce.
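The map/sort/reduce pipeline described above can be mimicked in a few lines of Python: map emits (word, 1) pairs, the framework sorts and groups by key (the intrinsic sort mentioned), and reduce sums each group. A toy single-process sketch, not how Hadoop distributes the work:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input."""
    return [(word.lower(), 1) for line in lines for word in line.split()]

def reduce_phase(pairs):
    """Shuffle/sort by key (done intrinsically by the framework), then sum each group."""
    pairs = sorted(pairs, key=itemgetter(0))
    return {key: sum(count for _, count in group)
            for key, group in groupby(pairs, key=itemgetter(0))}

counts = reduce_phase(map_phase(["hadoop hdfs hadoop", "yarn hdfs"]))
```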
Back in late 2012 I followed the textbook example and played with Hadoop 0.20.0. Setting up and installation was a breeze, thanks to the many user guides and tutorials made available by the community. In early 2013, Hadoop 0.23.6 came by, and I assumed the installation was going to be identical to the earlier version, but I was wrong. As a matter of fact, I resorted to the nonstandard way of using the tree command to find the changes in the directories for the necessary configuration files. If the version documentation had been good at that time, it would have saved me some of my hair.

Hadoop 0.23.6 is an interesting release. In this version, several major changes/overhauls were made. Most notably, the HADOOP.MAPRED API is deprecated and superseded by HADOOP.MAPREDUCE, aka MRv2. Resource management of a Hadoop cluster was delegated to a dedicated service named YARN. Several advanced data structures meant for programming MapReduce were added; some were deprecated (I will go into the implementation details in future posts).
For a complete genealogy of Hadoop versions, check this out.
This install guide assumes
  1. Ubuntu Server 11.x on a VM; I allotted 40GB for a start, but ran out very quickly.
  2. Hadoop 0.23.6 in releases
  3. Java 6 openJDK
  4. Hadoop cluster lives as a single node
Several things to take note of prior to running Hadoop: locate the directory of configuration files; differentiate between datanode and namenode; dedicate a “Hadoop user”; set the necessary file permissions on the directories; HDFS is not your regular file system and requires separate software to access it; Hadoop starts with daemons.

Step1: Download Hadoop and extract it to a directory.
The name of the directory with the extracted files shall be used in all of the following configs. E.g. I have created the folder “/usr/local/hadoop” and extracted the files into it.
Step2: Locate the configuration templates, and directory to place the configurations
#template
/usr/local/hadoop/share/hadoop/common/templates/conf
#path to place configuration files
/usr/local/hadoop/etc/hadoop
Step3: create directory for temporary files, logs, namenode, and datanode
/usr/local/hadoop/data/hdfs/datanode
/usr/local/hadoop/data/hdfs/namenode
/usr/local/hadoop/data/hdfs
/home/user/hadoop/tmp
#output for hadoop logs
/user/user
Step4: copy the example configuration templates to the config directory, then edit the configuration files.
The configuration files needed are yarn-site.xml, core-site.xml, hdfs-site.xml, and mapred-site.xml. Put the required parameters into the configuration files mentioned above. A sample of the configured files is available to download here.
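For illustration, a minimal single-node core-site.xml of this era might look like the following; the hdfs://localhost:9000 address is an assumption (the sample files linked above are authoritative), and the tmp dir matches the directory created in Step3:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/user/hadoop/tmp</value>
  </property>
</configuration>
```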
Step5: add the necessary paths and verify the paths
#paths to add to ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
export HADOOP_HOME=/usr/local/hadoop
# update the paths
source ~/.bashrc
#Verify  the paths
#output should be similar to the following
share/doc/hadoop/api/org/apache/hadoop/examples
/usr/local/hadoop/hadoop/hadoop-0.23.6/share/doc/hadoop/api/org/apache/hadoop/examples
/usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/doc/hadoop/api/org/apache/hadoop/examples
/usr/local/hadoop/share/doc/hadoop/api/org/apache/hadoop/examples
/usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/doc/hadoop/api/org/apache/hadoop/lib
/usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/hadoop/mapreduce/hadoop-mapreduce-client-core-0.23.6.jar
/usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/hadoop/mapreduce/hadoop-mapreduce-client-common-0.23.6.jar
Step 6: Format name node
Warning: this step only needs to be done ONCE for each newly set-up cluster. Executing this command on an existing cluster risks data loss.
#do once only, at the initial setup of hadoop
bin/hadoop namenode -format
Step 7: Start the daemon for Hadoop
sbin/hadoop-daemon.sh start namenode
sbin/hadoop-daemon.sh start datanode
sbin/yarn-daemon.sh start resourcemanager
sbin/yarn-daemon.sh start nodemanager
sbin/mr-jobhistory-daemon.sh start historyserver
Step8: verify Hadoop cluster with jps
Assuming the setup and configuration went fine, the following screen will appear after typing the command “jps”.

Step9: verify Hadoop cluster with web based consoles
Note: 192.168.253.130 is the IP address of my Ubuntu server.
#namenode console to verify o/p
#for ResourceManager
#for Job History Server
Step10: Verify Hadoop & MapReduce in action
run example word count
#copy text files from “/home/user/upload” to HDFS directory “/user/user/txt”
bin/hadoop dfs -copyFromLocal /home/user/upload /user/user/txt
bin/hadoop dfs -ls /user/user/txt
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.6.jar wordcount /user/user/txt /user/user/txt-output
calculate pi
#run an example calc pi
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.6.jar pi -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory -libjars share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-0.23.6.jar 16 10000
Compile a custom word count in java with MapReduce on Hadoop 0.23.6
#to compile
javac -classpath /usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/hadoop/common/hadoop-common-0.23.6.jar:/usr/local/hadoop/share/hadoop/hadoop-0.23.6/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-0.23.6.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-0.23.6.jar -d classes WordCount.java && jar -cvf wordcount.jar -C classes/ .
#to execute
/usr/local/hadoop/bin/hadoop jar wordcount.jar org.myorg.WordCount /user/user/txt /user/user/bigram-output

Verify output with
/usr/local/hadoop/bin/hdfs dfs -ls /user/user


Saturday, November 30, 2013

RGB colour cycle Arduino

cycle the colour wheel using RGB LED with Arduino PWM pin (digitalWrite)
cycle the colour wheel using RGB LED with Arduino PWM pin (analogWrite)
http://www.arduino.cc/en/Tutorial/ColorCrossfader
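The linked ColorCrossfader tutorial steps each PWM channel linearly from the current colour towards the target, one analogWrite() per channel per step. The per-step arithmetic can be sketched in Python (8-bit channel values; the 255-step count mirrors the PWM range):

```python
def crossfade(start, end, steps=255):
    """Linearly interpolate each RGB channel from start to end, one frame per step."""
    frames = []
    for s in range(steps + 1):
        frames.append(tuple(
            round(a + (b - a) * s / steps)   # per-channel value for analogWrite()
            for a, b in zip(start, end)))
    return frames

fade = crossfade((255, 0, 0), (0, 0, 255))   # red fading to blue
```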

Friday, November 15, 2013

sixpence 3D scanning kit

While I was spending my summer in London typing away at my thesis, one of my extracurricular activities was to pop over next door to the Institute of Making to make some interesting stuff. At one of the workshops, we did some 3D scanning using both open source tools, such as ReconstructMe + Kinect, and a proprietary solution, the NextEngine. Scanning a human subject using the Kinect in the absence of a scanning rig is really tiresome: holding the laptop, the Kinect, and the power supply while circling the subject in incremental steps is tedious. Nonetheless, here I present to you: yours truly in MeshLab.


Over the weekend, I thought of an idea for dirt cheap 3D scanning with existing items. By existing items, I mean items on my desk such as an Android mobile phone, an Arduino, and a servo. While researching cloud computing and its applications, I discovered a really cool website http://apps.123dapp.com/catch/ that leverages cloud computing to generate a 3D model from multiple pictures of an object. Taking (at most 70) pictures of an object through 360 degrees manually, without a rig, is really tiring. So, my weekend project, a 3D scanning kit to automatically take pictures around 360 degrees of a subject without human intervention, can be decomposed into 4 sub-parts. Part 1: I need a turntable of some sort to rotate my subject 360 degrees. Part 2: there must be some sort of communication channel between my turntable and the picture-taking apparatus. Part 3: the picture-taking apparatus must be capable of receiving commands. Part 4: upload the pictures to 123D Catch to generate the 3D model.

part1: turntable
turntable with subject container

taking pictures manually
in case you wonder what the pen is doing there


Parts needed: an Arduino, a full rotation servo, and code.
The full rotation servo (FRS) I had on hand was picked up from a rubbish dump. Upon testing, it still functioned; how lucky. Here comes the interesting problem: using the example sweep code from Arduino, the FRS behaves erratically. It does not stop exactly at 15 degrees and continues to spin. The reason: the servo is modified; the “horn” on a gear inside the servo is broken off. Tough luck using standard code. So, I had to come up with a scheme to stop the FRS at every 15 degrees via code.
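With the feedback horn broken, a continuous-rotation servo can only be positioned by timing: drive it for a calibrated duration per 15-degree step, then write the neutral pulse to stop it. A sketch of the scheme in Python (the 150 ms-per-step figure is a made-up calibration constant, not measured from the actual servo):

```python
STEP_DEG = 15       # rotate this much per photo
MS_PER_STEP = 150   # hypothetical calibration: how long full speed covers 15 degrees

def step_commands(total_deg=360):
    """Timed run/stop command pairs that sweep a full turn in 15-degree increments."""
    cmds = []
    for _ in range(total_deg // STEP_DEG):
        cmds.append(("run", MS_PER_STEP))   # e.g. myservo.write(full speed); delay(MS_PER_STEP)
        cmds.append(("stop", 0))            # e.g. myservo.write(90): neutral pulse halts the FRS
    return cmds

cmds = step_commands()   # 24 run/stop pairs cover the full 360 degrees
```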

As for the container for the subject, I used newspaper to create the background, such that when the 3D model generating algorithm runs, the patterns on the newspaper can be used as reference points. That is according to the 123D Catch guide.

Part2: communication
Parts needed: Android device (API level 17 onwards), OTG cable
Quite reluctant I am to purchase a Bluetooth shield for the Arduino just for communication. Furthermore, I am using an Android phone running Android 4.3 (API level 18). This particular version supports a direct USB connection from, say, a keyboard or mouse to the Android phone via a microUSB OTG cable (USB Type-A female to microUSB male). It is much more cost effective for me to use OTG than a Bluetooth shield.

A quick look at the open source community and I stumbled upon this GitHub repo https://github.com/dtbaker/android-arduino-usb-serial which I believe was forked from https://code.google.com/p/usb-serial-for-android/. Many thanks to the open source contributors for allowing me to quickly try out USB serial code between Android <--> Arduino. Just a point to note: the baud rate on the Android side is 115200, so the Arduino must set up serial at the same baud rate.

Combining parts 1 and 2, I devised a scheme for my 3D scanning kit: the Arduino turns the turntable every 15 degrees, then sends an ASCII character to the Android device to signal it to take a picture.
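The handshake is simple enough to sketch end-to-end: the Arduino writes one ASCII character over USB serial after each 15-degree step, and the Android side counts incoming characters, firing the camera once per character. Simulated here in Python (the 'T' trigger byte is my choice for illustration, not from the original code):

```python
TRIGGER = b"T"   # hypothetical trigger byte sent at 115200 baud after each step

def arduino_side(n_steps=24):
    """Bytes the turntable writes over USB serial: one trigger per 15-degree step."""
    return TRIGGER * n_steps

def android_side(stream):
    """Count triggers received; each one fires the camera once."""
    return sum(1 for b in stream if bytes([b]) == TRIGGER)

photos = android_side(arduino_side())   # one photo per step, a full revolution's worth
```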
The code for arduino is here 
Another point to note: print out the serial data received on the Android side to prove the assumption that it is the same as what is received on a hyper terminal. I learnt that the hard way.

Part3: taking multiple pictures on the Android device without human intervention.

There are excellent tutorials, such as this and this, for writing code to use the Android device's camera to take ONE picture. Having written my last Android app from scratch on my HTC Magic, Android 1.6 (API level 4), I assumed I would not have any issues using the API for Android 4.3 (API level 18). Besides that, having used MIT App Inventor for mockups and POCs without writing code from the ground up, following the standard methodology, left me jaded when it comes to developing Android apps.

The SOP for taking a picture on an Android device via the camera API is quite straightforward: create an activity; add a button listening for an event to take a picture; add a view to the frame layout for the camera preview; save the picture to the device's memory; after the picture is taken, refresh the preview. I assumed I would spend at most 4 hours after office hours writing code to automatically take multiple pictures without user intervention (nobody clicking the button to take a picture). Little did I foresee that I would stare at the code for a few nights, wrestling with the Android framework to find the crashes, due to the way PictureCallback(), onPictureTaken(), and the preview refresh are supposed to be used. The experiments and the amount of code I wrote challenging my assumptions, race conditions, critical sections, multithreading, which I thought might be the root cause of the crashes, warrant a lengthy post by themselves.

Nonetheless, after staring and experimenting for a few nights straight, I present to you the code of this Android app, hosted on GitHub.

Part 4: upload pictures to 123D catch

Combining parts 1, 2, and 3, set up a stand for the Android device to take pictures.

Copy the 31 images from the Android phone to be uploaded to 123D Catch.

Generate a 3D model from the pictures uploaded

Note: no model was generated (I got a blank screen after the supposed completion of 123D Catch [online]), and I waited close to 30 min to save the project, without success.
Edit: I tried taking a few shots of the same subject manually and uploading them to 123D Catch, just to prove my assumption (that the pictures taken by my app are not usable) wrong. Surprisingly, no model was generated either. Really weird.

Some fine tuning is required: I noticed the pictures taken by my Android device were out of focus. Maybe that is the reason why the model was not generated. Pictures generated by the sixpence 3D scanning kit do work!

Edit: I placed my subject too close to the lens, hence the depth of field caused blurry images, akin to a blurred subject with a clear background. I am still trying to find the API that allows for macro-mode autofocus.

Edit: For some weird reason, the 123D Catch online version does not work on my laptop. I left it running overnight, and when I checked my computer the next day, still no model was generated. However, the 123D offline version does work, using the pictures generated by the sixpence 3D scanning kit.
uploading pictures
processing the capture into 3D model
sitting there looking pretty


THIS IS SPARTAAAAAAAAAAAAA!!!!


A quick view in MeshLab. Further manipulation is needed before 3D printing. For starters, the newspaper background has got to go.



update:
the 3D model opened in Meshmixer. The newspaper portion is selected and then deleted by pressing the "x" key


there are a few gaps that need to be fixed in the edited 3D model before it can be printed.

the 3D model is edited into a watertight model, ready for the 3D printer.
the 3D printed model