Magic Leap Spatial Computing Integration Project. This is an experimental display gallery for understanding scale, positioning, and markup.
This project contains Magic Leap-compatible content viewable with the Helio browser.
Gallery objects: Clorox wipes, IRL microwave, IRL dishwasher, wind turbine
Why We Are Making This Project
This project is used to demonstrate and learn how to apply spatial computing to everyday applications. It contains an evolving collection of objects that can be observed, extracted, and placed using the Magic Leap One. makeSEA is the premier content management and delivery platform for Magic Leap. To learn more and become a beta user, please Contact Us.
This project focuses on content designed for delivery via the Helio browser using Prismatic scripting and markup.

Other Experimental Projects:
- ML Remote
- World Sensing
- World Mesh with Occlusion
- Headpose-based Object Placement
- Near Clip Plane Exploitation*
- Helio Browser
*Considered a spatial computing no-no and a reality-breaker, but useful for certain technical applications.
https://www.script-tutorials.com/webgl-with-three-js-lesson-7/ (how to do inline previews for various content types)
Build requirements (notes on these fixes are fuzzy but should be correct):

Try the out-of-the-box (OOTB) build first. If it errors: in order for this to build on Mac with the Unity environment, the scripts whose names include "Test" need to be moved under a directory named "Editor":

mv /Assets/Resources/*Test* /Assets/Editor
mv /Assets/Unity/Scripts/Tests/*Test* /Assets/Unity/Scripts/Tests/Editor   (directory should already exist)
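The two moves above can also be scripted. A minimal sketch in Python (the Assets paths are the ones from the notes and are assumptions about this project's layout; run from the Unity project root):

```python
import shutil
from pathlib import Path

def move_tests_to_editor(src_dir: str, editor_dir: str) -> list:
    """Move scripts whose names contain 'Test' into an Editor directory,
    where Unity compiles them as editor-only code."""
    editor = Path(editor_dir)
    editor.mkdir(parents=True, exist_ok=True)  # create the Editor dir if missing
    moved = []
    for script in sorted(Path(src_dir).glob("*Test*")):
        shutil.move(str(script), str(editor / script.name))
        moved.append(script.name)
    return moved

# Usage mirroring the notes above:
# move_tests_to_editor("Assets/Resources", "Assets/Editor")
# move_tests_to_editor("Assets/Unity/Scripts/Tests", "Assets/Unity/Scripts/Tests/Editor")
```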
- Works well with photos, even if they are slightly blurry.
- Once the sparse point cloud is created, the next step is to shrink the bounding box to just your subject (in this case, a stove). That way the background noise can still help create the dense point cloud, but points outside the bounding box will not be included in it.
- 3DF Zephyr takes a long time to extract photos from video.
- At a capture rate of 1 frame per second, the program obtained 87 photos from a video 01:23 long.
- Of those 87 photos, only BLANK were usable.
- Increasing the sparse cloud setting from default to deep raised this number to 70.
- The sparse and dense point clouds created from still photos are much more detailed, with fewer holes, when creating the mesh.
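As a rough sanity check on the frame-extraction numbers above (a sketch only; real extractors may add frames at clip boundaries, which would explain getting 87 frames from an 83-second clip):

```python
def expected_frames(minutes: int, seconds: int, fps: float = 1.0) -> int:
    """Approximate frame count when sampling a video at a fixed rate."""
    duration_s = minutes * 60 + seconds
    return int(duration_s * fps)

# The 01:23 video from the notes, sampled at 1 frame per second:
frames = expected_frames(1, 23)  # about 83 frames expected
```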
- Install Blender and change the startup file so it opens with an empty scene.
- Delete the initial cube, then save the empty scene as the startup file.
- Save convert_stl_to_fbx.py from the assets of this project to the Blender folder under Blender Foundation.
- In the command prompt, navigate to that same folder, then run:

blender --background --python convert_stl_to_fbx.py -- test.stl test.fbx

- Here test.stl is the location of the STL file you wish to convert, and test.fbx is the location of the converted file.
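Blender passes everything after the `--` separator through to the script rather than parsing it itself; inside convert_stl_to_fbx.py those paths can be recovered from sys.argv. A minimal sketch of that argument handling (the actual script in this project's assets may differ):

```python
import sys

def script_args(argv: list) -> list:
    """Return the arguments following the '--' separator, per Blender's
    command-line convention; empty list if there is no separator."""
    if "--" in argv:
        return argv[argv.index("--") + 1:]
    return []

# The command line from the notes, as Blender would present it:
argv = ["blender", "--background", "--python", "convert_stl_to_fbx.py",
        "--", "test.stl", "test.fbx"]
stl_path, fbx_path = script_args(argv)  # "test.stl", "test.fbx"
```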