Open Source Research
This section of the wiki collects studies that have been open-sourced or have publicly released some or all of their assets. For now, all such studies are listed on this page. Once the collection grows large enough to make this approach unwieldy, entries will be categorized by some characteristic, such as software platform. If you would like a study to be included, feel free to reach out or open an issue (preferably with this template filled out).
Transferring paradigms from physical to virtual reality: Can reaction time effects be replicated in a virtual setting?
- Tags: vision, real-virtual-comparison, RT
- Authors: Michael Wiesing, Hendrik Steinkönig, Simone Vossel, Gereon R. Fink, Ralph Weidner
- Date: 10-2022
- Method Summary: Presented paired visual stimuli (horizontally aligned gratings of varying color, orientation, and spatial frequency) to participants on both a physical and a virtual monitor; participants judged each pair as "same" or "different" in spatial frequency using a pair of NataTech button pads. Recorded reaction time and accuracy (see the sketch after this entry).
- Software Platform: UE4 (4.22)
- Framework: Custom
- Hardware Platform(s): OpenVR, NataTech
- HMD: HTC Vive
- Tracking/Response hardware: 2x Vive controllers, 2x NataTech button pads
- Assets Available: A fully textured 3D lab scene with baked lighting (video). A set of scripts that handle the controlled presentation of visual stimuli and data collection. Collected data and analysis scripts.
- Relevant Links:
- GitHub (Experiment) (Note: Requires an Epic Games account with UE access)
- OSF (Analysis & Data)
- Paper
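
The method summary above describes a same/different reaction time task implemented in UE4. As a rough illustration only (this is not code from the study's repository), the sketch below shows one conventional way to timestamp stimulus onset and a button response in UE4 C++ using FPlatformTime::Seconds(); the FTrialLogger class and both callbacks are hypothetical names, and how they are wired into the input system would depend on the experiment framework.

```cpp
// Hypothetical sketch, not taken from the study's repository: one common
// way to compute per-trial reaction time in UE4 C++ from game-thread
// timestamps.
#include "CoreMinimal.h"
#include "HAL/PlatformTime.h"

class FTrialLogger
{
public:
    // Call when the grating pair becomes visible on the (virtual) monitor.
    void OnStimulusOnset()
    {
        StimulusOnsetSeconds = FPlatformTime::Seconds();
    }

    // Call from the input binding for the "same"/"different" button pads.
    // Returns the reaction time in milliseconds.
    double OnResponse(bool bRespondedSame, bool bStimuliWereSame)
    {
        const double RtMs =
            (FPlatformTime::Seconds() - StimulusOnsetSeconds) * 1000.0;
        const bool bCorrect = (bRespondedSame == bStimuliWereSame);
        UE_LOG(LogTemp, Log, TEXT("RT: %.3f ms, correct: %d"),
               RtMs, bCorrect ? 1 : 0);
        return RtMs;
    }

private:
    double StimulusOnsetSeconds = 0.0;
};
```

Note that game-thread timestamps like these do not account for render-pipeline or display latency, which is one reason comparing reaction times between a physical and a virtual monitor is nontrivial; see the paper and the released scripts for how the study actually handled stimulus timing.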