#vr

danie10@squeet.me

How many years do we need to be told VR is the future before it actually takes off?

A woman outlined in a neon-coloured halo. She is wearing a VR headset.
Virtual reality has been close to mainstream adoption for decades, yet it remains a niche technology. While promising developments and incremental improvements continue, the game-changing app that will propel VR into every home remains elusive despite the efforts of tech giants like Meta, Google, and Apple.

History is littered with VR failures, from the early Nintendo Virtual Boy to the pricey Apple Vision Pro, but the promise of immersive digital worlds keeps companies investing billions. The stakes are high, as whoever unlocks mass-market VR will shape the future of the user experience across gaming, fitness, productivity, and beyond.

I’m very sure that VR will be an amazing and immersive technology. I was only about a month away from buying a Quest headset myself when Meta bought them out, and I dropped that idea very quickly. I’ve clung to my 3D TV because I’ve always enjoyed watching movies in 3D.

But, for me, VR really needs two things to happen:

  • Pricing needs to be affordable (not Apple’s $3,500).
  • It needs some killer apps that take it to the next level: truly immersive, compelling experiences that pull users in.

I don’t even think it’s about having the very highest resolution, as the 3D stereo effect will still reel people in. And some people seem to experience problems wearing VR headsets for longer periods; we probably need to understand why that is.

See androidpolice.com/when-will-vr…
#Blog, #technology, #VR

danie10@squeet.me

BrickMasterVR – Building Lego using Oculus Quest VR Glasses

A screenshot of a virtual reality view of a Lego house being built. In the foreground is a hovering cube with the words “Parts Selector” next to it.
I know someone who builds Lego for therapy. He buys sets on an ongoing basis, some of which are well over 10,000 pieces (think of the Titanic). They take up a mass of space. I get that some sets are great for physical display, but it also gets costly to keep building sets and then giving them away.

There is already free online virtual Lego building with Bricklink Studio, but it offers none of the “handling” of the pieces, nor lets you see the model in proper three dimensions.

The logical direction to take this is towards VR-headset-style solutions, where you can see the model in 3D (as if you were looking at the real thing) and pick up and place pieces using virtual hands. A partnership between Lego and Apple would also have been a logical fit for this.

But, as far as I know, there is still no real working solution for this, despite lots of interest in it.

The video linked below shows an early prototype in action, but that was about 3 years ago. It was also discussed on Reddit in r/OculusQuest back then.

See youtube.com/watch?v=Vrm0V6goFl…
#Blog, #lego, #technology, #VR

rhysy@diaspora.glasswings.com

I still have no idea what Gaussian splatting is, but on a quick test the results are impressive. The app is rough around the edges - everything seems to bounce slightly as the headset moves, in a way that doesn't happen normally. But the visuals are good. Some surfaces, as with ordinary photogrammetry, clearly haven't scanned as well as others. Fine details like the grill on a fan are also largely lost. But on the other hand, the overall level of detail, the number of objects, and the precision with which they're captured are well above what I'd expect to be possible natively on a headset.
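(For anyone else wondering, a rough sketch of the idea as described in the original 3D Gaussian splatting paper, not anything specific to this app: the scene is stored as a huge set of coloured, semi-transparent 3D Gaussians, which are projected to the screen and alpha-composited front to back, so the colour at a pixel p is roughly

$$ C(p) = \sum_{i} c_i \, \alpha_i \prod_{j<i} \left( 1 - \alpha_j \right) $$

where $c_i$ and $\alpha_i$ are the colour and effective opacity of the $i$-th Gaussian covering $p$, sorted by depth.)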

#VR

https://www.roadtovr.com/meta-horizon-hyperscape-photorealistic-app-quest-3-s/

rhysy@diaspora.glasswings.com

Sounds extremely strange. I'd like to try it.

The user wears a VR headset with a 360° camera perched on their head. The camera has fisheye lenses on the front and back, and stitches the two inputs together to make a 360° panorama. The headset shows the user a segment of this panorama as a normal camera view, but the twist is that the effect of turning one’s head is amplified.

Turning one’s head 45 degrees to the left displays as though one’s head turned 90 degrees, and turning 90 degrees (i.e. looking straight left) displays the view directly behind. One therefore compresses an entire 360 degrees of horizontal visual awareness into the normal 180 degree range of neck motion for a person, without having to resort to visual distortions like squashing the video.
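A minimal sketch of that mapping (my own illustration, not code from the project), assuming a simple 2× yaw gain applied before sampling the panorama:

```python
# Sketch of the amplified head-turn mapping described above.
# Assumes a 2x yaw gain; names and values are illustrative.

def amplified_view_yaw(head_yaw_deg: float, gain: float = 2.0) -> float:
    """Map physical head yaw to the yaw of the displayed panorama segment."""
    display_yaw = head_yaw_deg * gain
    # Wrap into [-180, 180) so a 90-degree physical turn lands on the
    # view directly behind (+/-180), with no squashing of the video.
    return (display_yaw + 180.0) % 360.0 - 180.0

print(amplified_view_yaw(45.0))  # 90.0: a 45-degree turn shows the 90-degree view
print(amplified_view_yaw(90.0))  # -180.0: looking straight left shows the view behind
```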

#VR
#Perception

https://hackaday.com/2024/07/16/giving-people-an-owl-like-visual-field-via-vr-feels-surprisingly-natural/

waynerad@diasp.org

"View and explore atomic models in Augmented Reality!"

I don't have a VR headset, and my Android device (I checked) does not support ARCore, so this is for those of you who can do VR.

"AR Atom Visualizer is an app that allows you to view and explore atomic models in Augmented Reality with Google ARCore on your smartphone. Many of us understand the basic structure of an atom: a nucleus containing protons and neutrons, surrounded by electrons - but how are those electrons organized? How do they move? What do they look like?"

"The Bohr model presents the atom as a shell containing a nucleus and the electrons in orbit around it. It helps us understand the energy level of electrons and how they are organised in relation to the nucleus."

"The quantum mechanical model presents the atom as an electron cloud. This helps us understand the possible location of the electrons in relation to the nucleus."

"AR Atom Visualizer uses Augmented Reality to create 3D animated visualizations of both these models of any atom in real space, just by using the camera on your smartphone."

AR Atom Visualizer - Growing STEM talent in the Signal Garden

#solidstatelife #vr #chemistry

rhysy@diaspora.glasswings.com

My poster for EAS 2024. It should work publicly now, though the official poster gallery doesn't seem to be up yet. Anyway, this continues my efforts to promote Blender-based data visualisation in astronomy. Next week I'll recycle and update a talk covering actual science.
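For anyone wondering what "Blender-based data visualisation" means in practice, here is a minimal, hypothetical sketch (the coordinates are made up; run it from Blender's Python console): it simply drops a small sphere at each 3D data point.

```python
# Minimal Blender (bpy) sketch: one small sphere per 3D data point,
# e.g. source positions from a survey. Coordinates are illustrative.
import bpy

points = [(0.0, 0.0, 0.0), (1.5, 0.2, -0.7), (-0.8, 2.1, 0.4)]

for x, y, z in points:
    bpy.ops.mesh.primitive_uv_sphere_add(radius=0.05, location=(x, y, z))
```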

#Astronomy
#Blender
#DataVisualisation
#3D
#VR

https://k-poster.kuoni-congress.info/eas-2024/poster/803490a3-04a4-46ca-9094-22a296467039

waynerad@diasp.org

"Full-body haptics via non-invasive brain stimulation."

"We propose & explore a novel concept in which a single on-body actuator renders haptics to multiple body parts -- even as distant as one's foot or one's hand -- by stimulating the user's brain. We implemented this by mechanically moving a coil across the user's scalp. As the coil sits on specific regions of the user's sensorimotor cortex it uses electromagnetic pulses to non-invasively & safely create haptic sensations, e.g., touch and/or forces. For instance, recoil of throwing a projectile, impact on the leg, force of stomping on a box, impact of a projectile on one's hand, or an explosion close to the jaw."

Hmm, somehow I don't think everyone's going to be doing this any time soon. Or maybe I'm wrong and this is the missing piece that will let VR take off? Let's continue.

"The key component in our hardware implementation is a robotic gantry that mechanically moves the TMS coil across key areas of the user's scalp. Our design is inspired by a traditional X-Y-Z gantry system commonly found in CNC machines, but with key modifications that allow it to: (1) most importantly, conform to the curvature of the scalp around the pitch axis (i.e., front/back), which is estimated to be 18 degrees for the sensorimotor cortex area, based on our measurement from a standard head shape dataset; (2) accommodate different heads, including different curvatures and sizes; (3) actuate with sufficient force to move a medical-grade TMS coil (~1 kg); (4) actuate with steps smaller than 8.5 mm, as determined by our study; and, finally, (5) provide a structure that can be either directly mounted to a VR headset or suspended from the ceiling."

"We feature three actuators, respectively, to move the coil in the X- (ear-to-ear translation), Y- (nose-to-back translation), and Z- (height away from the scalp) axes. Since the X-axis exhibits most curvature as the coil moves towards the ear, we actuate it via a servo motor with a built-in encoder, this offers a reliable, fairly compact, strong way to actuate the coil."

"To account for curvature, the Z-axis needs to be lifted as the coil traverses the head."

"We used a medically compliant magnetic stimulator (Magstim Super Rapid) with a butterfly coil (Magstim D70)."

"We stimulated the right hemisphere of the sensorimotor cortex (corresponding to the left side of the body) with three consecutive 320 microsecond TMS pulses separated by 50 ms, resulting in a stimulation of ~150 ms. While we opted to only stimulate the right side of the brain to avoid fatigue, the results will be generalizable to the left side."

"We identified two locations on the participant's scalp that yielded minimum stimulation intensities to elicit observable limb movement (i.e., motor threshold) for the hand and foot."

"For each location, the intensity was set to 10% below the hand's motor threshold. The amplitude of TMS stimulation was reported in percentage (100% is the stimulator's maximum). During a trial, the experimenter stimulated the target location. Afterward, the participant reported the strongest point and area of a perceived touch as well as a keyword (or if nothing was felt). Then, the experimenter increased the intensity by 5% while ensuring the participant's comfort & consent, and moved to the next trial. This process continued until the participant reported the same location and same quality of sensations for two consecutive trials, or the intensity reached the maximum (i.e., 100%)."

"After each study session, we organized the participants' responses regarding touch sensations based on where the strongest point of the sensation was. We also annotated each trial to indicate which of the following body parts moved: 'none', 'jaw', 'upper arm', 'forearm', 'hand' (i.e., the palm), 'fingers', 'upper leg' (i.e., the thigh), 'lower leg', and 'foot'."

"Results suggest that we were able to induce, by means of TMS, touch sensations (i.e., only tactile in isolation of any noticeable movements) in two unique locations: hand and foot, which were both experienced by 75% of the participants."

"Our results suggest that we were able to induce, by means of TMS, force-feedback sensations (i.e., noticeable involuntary movements) in six unique locations: jaw (75%), forearm (100%), hand (92%), fingers (83%), lower-leg (92%), and foot (92%), which were all experienced by >75% of the participants. In fact, most of the actuated limbs were observed in almost all participants (>90%) except for the jaw (75%) and fingers (83%). The next most promising candidate would be the upper leg (58%)."

Surely they stopped there and didn't try to do a full-blown VR experience, right? Oh yes they did.

"Participants wore our complete device and a VR headset (Meta Quest 2), as described in Implementation. Their hands and feet were tracked via four HTC VIVE 3.0 Trackers attached with Velcro-straps. Participants wore headphones (Apple Airpods Pro) to hear the VR experience."

"Participants embodied the avatar of a cyborg trying to escape a robotics factory that has malfunctioned. However, when they find the escape route blocked by malfunctioning robots that fire at them, the VR experience commands our haptic device to render tactile sensation on the affected area (e.g., the left hand in this case, but both hands and feet are possible). To advance, participants can counteract by charging up their plasma-hand and firing plasma-projectiles to deactivate the robots. When they open the palm of their hands in a firing gesture, the VR experience detects this gesture and prompts our haptic device to render force-feedback and tactile sensations on the firing hand (e.g., the right hand in the case, but both hands can fire plasma shots). After this, the user continues to counteract any robots that appear, which can fire shots against any of the user's VR limbs (i.e., the hands or feet). The user is shot in their right foot -- just before this happens, the VR prompts our haptic device to render tactile sensation on the right foot. After a while, the user's plasma-hand stops working, and they need to recharge the energy. They locate a crate on the floor and stomp it with their feet to release its charging energy. Just before the stomping releases the energy, the VR commands our haptic device to render force-feedback and tactile sensation on the left leg. The user keeps fighting until they eventually find a button that opens the exit door as they are about to press it, the VR requests that our haptic device render tactile sensation on the right hand. The user has escaped the factory."

Full-body haptics via non-invasive brain stimulation

#solidstatelife #vr #haptics