My friend cannot see holograms well, but when she gets close she says, "I can see it!" She was very impressed.
In fact, I built a magnified view combined with the eye-tracking feature, and she said the results were very good and much easier to see. (Although my implementation was rather hacked together and not something I could use on a daily basis, considering the load, etc.)
I find this feature very appealing and fascinating, as I have sometimes wanted a zoomed display myself because of resolution limits and the user's environment.
I think this feature would contribute greatly to the accessibility of Mixed Reality applications.
I'll be the first to try this feature when it's ready and share it, along with what I learn, with fellow developers and many others at community events like the TokyoHoloLens Meetup!
This all sounds wonderful, thank you @HoloMoto! Yes, we would love any feedback or learnings from your work on tools that improve vision for people with low vision. (Thanks for the blog post link too, I'll read it.)
Accessibility is something we want to put more emphasis on for Mixed Reality tools.
Describe the problem
Graphics Tools should supply a tool to help magnify the screen in stereo rendering contexts.
Describe the solution you'd like
1. Create URP Render Feature to Blit a Zoomed in Frame Buffer in XR
Great starting place for how to do this: How to perform a full screen blit in Single Pass Instanced rendering in XR | Universal RP | 13.1.7 (unity3d.com)
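For reference, below is a minimal sketch of what such a render feature could look like, loosely following the URP 13 fullscreen blit sample linked above. The `Hidden/GraphicsTools/ZoomBlit` shader name and its `_ZoomFactor` property are placeholders, not an existing Graphics Tools asset; the real shader would need to sample the camera color with the `TEXTURE2D_X` macros so it works under Single Pass Instanced rendering.

```csharp
// Sketch only: a URP ScriptableRendererFeature that applies a zoom blit.
// Assumes URP 13.x (RTHandle/Blitter API) and a hypothetical fullscreen
// shader "Hidden/GraphicsTools/ZoomBlit" that is stereo-aware.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ZoomRenderFeature : ScriptableRendererFeature
{
    class ZoomPass : ScriptableRenderPass
    {
        private readonly Material material;
        private RTHandle cameraColorTarget;

        public ZoomPass(Material material)
        {
            this.material = material;
            renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
        }

        public void SetTarget(RTHandle colorTarget) => cameraColorTarget = colorTarget;

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            if (material == null || cameraColorTarget == null)
                return;

            CommandBuffer cmd = CommandBufferPool.Get("ZoomBlit");
            // Blit the camera color back onto itself through the zoom shader.
            // Blitter handles the stereo (TEXTURE2D_X) plumbing for XR.
            Blitter.BlitCameraTexture(cmd, cameraColorTarget, cameraColorTarget, material, 0);
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    [SerializeField] private Shader zoomShader;            // assign the zoom blit shader
    [Range(1f, 8f)] public float zoomFactor = 2f;          // exposed so a component can drive it

    private Material zoomMaterial;
    private ZoomPass zoomPass;

    public override void Create()
    {
        if (zoomShader != null)
            zoomMaterial = CoreUtils.CreateEngineMaterial(zoomShader);
        zoomPass = new ZoomPass(zoomMaterial);
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType == CameraType.Game)
            renderer.EnqueuePass(zoomPass);
    }

    public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType != CameraType.Game || zoomMaterial == null)
            return;

        zoomMaterial.SetFloat("_ZoomFactor", zoomFactor);  // property name is an assumption
        zoomPass.ConfigureInput(ScriptableRenderPassInput.Color);
        zoomPass.SetTarget(renderer.cameraColorTargetHandle);
    }

    protected override void Dispose(bool disposing)
    {
        CoreUtils.Destroy(zoomMaterial);
    }
}
```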
Once you have this working in a simple test scene, look at:
2. Create Component to Add & Configure Zoomed In Render Feature
The Unity component should allow end users to configure the zoom feature in the Unity inspector (and at runtime):
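As a rough illustration (not an existing Graphics Tools API), the component could hold a reference to the render feature sketched above and push its settings whenever they change; the field and property names here are assumptions:

```csharp
// Sketch of a runtime-facing zoom component driving the ZoomRenderFeature above.
using UnityEngine;

public class ScreenZoomController : MonoBehaviour
{
    [SerializeField] private ZoomRenderFeature zoomFeature; // assign the feature asset in the inspector

    [Range(1f, 8f)]
    [SerializeField] private float zoomLevel = 2f;

    [SerializeField] private bool zoomEnabled = true;

    public float ZoomLevel
    {
        get => zoomLevel;
        set { zoomLevel = Mathf.Clamp(value, 1f, 8f); Apply(); }
    }

    public bool ZoomEnabled
    {
        get => zoomEnabled;
        set { zoomEnabled = value; Apply(); }
    }

    private void OnEnable() => Apply();
    private void OnValidate() => Apply();   // keeps inspector edits live in play mode

    private void Apply()
    {
        if (zoomFeature == null)
            return;

        zoomFeature.zoomFactor = zoomLevel;
        zoomFeature.SetActive(zoomEnabled); // ScriptableRendererFeature.SetActive toggles the pass
    }
}
```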
3. Build a Zoom Sample Scene
Add a sample scene to MRGT which shows how to configure the zoom component and display it in the world.
4. Implement Post Processing Options for the Zoom
This will require experimentation to see what is feasible, but some initial ones to test:
Ideally these happen in the same shader pass as the zoom (to avoid extra passes) and can be toggled on/off at runtime via the zoom component.
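One possible way to keep the options in the same pass is to compile them as shader variants and flip keywords on the zoom material at runtime. The helper and keyword names below are placeholders, not options this issue has settled on:

```csharp
// Sketch: toggle post-processing options as shader keywords on the zoom material,
// so the options run inside the existing zoom blit instead of adding passes.
using UnityEngine;

public static class ZoomPassOptions
{
    public static void SetOption(Material zoomMaterial, string keyword, bool enabled)
    {
        if (enabled)
            zoomMaterial.EnableKeyword(keyword);
        else
            zoomMaterial.DisableKeyword(keyword);
    }
}

// Example usage from the zoom component (keyword names are hypothetical):
// ZoomPassOptions.SetOption(zoomMaterial, "_ZOOM_EDGE_ENHANCE", true);
// ZoomPassOptions.SetOption(zoomMaterial, "_ZOOM_CONTRAST_BOOST", false);
```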
Describe alternatives you've considered
n/a
Additional context
Support for URP only is fine.
Could be similar to the bifocals in SeeingVR: SeeingVR: A Set of Tools to Make Virtual Reality More Accessible to People with Low Vision - Microsoft Research