Jeanyoon Choi


Proposal - The Innovative-Interactive-Immersive-Inspirational Metaverse

Updated: 7/16/2024

The Innovative-Interactive-Immersive-Inspirational-Intelligent-Imaginative-Inclusive-Integrated Metaverse: Proposal

Immersiveness
- How to reconstruct HMD-driven immersiveness within a mobile environment
- Pointing, Pinching, Zooming → Scrolling, Pinching, Zooming
- Ian Cheng-like interactions?
- Dadaistic representation/reconstruction of immersiveness within the mobile Metaverse: criticism via repetition and exaggeration?
- Just like the portfolio website

The original idea: How to bring interactive immersiveness into a multi-device web artwork setting? How to reconstruct HMD-driven immersiveness within a mobile environment? Pointing, pinching, and zooming (Apple Vision Pro interactions) become scrolling, pinching, and zooming on mobile. Can this also act as a dadaistic representation of the criticism that everyone talks about the metaverse before engaging with it seriously?

The Innovative-Interactive-Immersive-Inspirational-Intelligent-Imaginative-Inclusive-Integrated Metaverse is a Multi-Device Web Artwork developed to expand the means of interactive immersiveness beyond traditional HMD/VR/AR setups while critiquing the exaggerated promises of future technologies. To reflect this critique, this residency project revisits the fundamental principles of interactive immersiveness, aiming to deepen our understanding of virtual worlds through medium-based research.
The project uses five channels: four screens (monitors/laptops) and one mobile. The four screens are positioned perpendicularly around the audience to create a 360-degree immersive environment. Unlike HMDs, where the screen is attached to the eyes, these screens surround the body, enhancing the real-world setting. The audience interacts with the environment through their mobile phones by scanning a QR code to access a customised website, which is connected to the screens via WebSocket in real-time. Mobile interactions control the content on the four screens simultaneously, similar to the relationship between a VR Headset and its controller. The phone’s orientation, tracked via its accelerometer, changes the camera rotation of the virtual world, while touch interactions on the mobile UI navigate the virtual world, acting like thumbsticks.
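The orientation-to-camera mapping described above can be sketched as a small pure function. This is a minimal sketch, assuming the standard DeviceOrientation angles (alpha, beta, gamma, in degrees) are relayed from the phone over WebSocket and applied to a Three.js camera; all names and the message shape here are illustrative, not the project's actual code.

```javascript
// Convert DeviceOrientation angles (degrees) from the phone into
// Euler rotations (radians) for the virtual-world camera.
// alpha: rotation around the Z-axis (compass heading)
// beta:  front-to-back tilt; gamma: left-to-right tilt.
const DEG2RAD = Math.PI / 180;

function orientationToCameraEuler({ alpha, beta, gamma }) {
  return {
    x: beta * DEG2RAD,   // pitch
    y: alpha * DEG2RAD,  // yaw
    z: -gamma * DEG2RAD, // roll (sign flipped to match screen space)
  };
}

// On the mobile page, each orientation event would be forwarded to the
// screens via WebSocket (hypothetical message shape):
//   socket.send(JSON.stringify({ type: 'orientation', alpha, beta, gamma }));
//
// On each screen, the received angles then drive the Three.js camera:
//   const e = orientationToCameraEuler(msg);
//   camera.rotation.set(e.x, e.y, e.z);
```

Keeping the angle conversion as a pure function lets the same mapping run identically on all four screens, so they stay in sync regardless of when each one receives the message.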
This setup offers a unique experience, distinguishable from traditional HMD setups, by breaking down existing norms and practices in constructing VR worlds and situating them within a multi-device web artwork. It thus invites developers, designers, artists, and audiences alike to reflect on the notion of interactive immersiveness within the virtual world. Simultaneously, it lays the foundation for a new medium beyond traditional VR/AR, offering a potentially more accessible yet still immersive format. While this multi-device web artwork can be categorised as augmented reality, it is distinguishable from web-based single-device AR.
The Innovative-Interactive-Immersive-Inspirational-Intelligent-Imaginative-Inclusive-Integrated Metaverse, as its name indicates, critiques the flashy adjectives associated with future technology. During the pandemic and beyond, we have seen how myopic capitalism is linked with illusory promises of yet-to-be-realised technologies, packaged with inflated entrepreneurship and techno-utopianism. This project criticises such situations, arguing for a steadier, more detailed, and more professional approach to each technology, just as this project deeply researches the notion of interactivity from an artistic perspective.
The virtual world for this project comprises 400 conference rooms arranged in a grid on the XY-plane, with 20 rooms along the X-axis and 20 along the Y-axis. Each row along the X-axis represents an 'I'-starting adjective (Interactive, Immersive, Inspirational, Intelligent, etc.), while each column along the Y-axis represents a technology (Blockchain, NFT, Web3, etc.). The conference room at each intersection of a row and a column has a unique conference agenda (e.g., The Immersive Metaverse). As the audience enters a room, a conference begins: chairs face the front, Stable Diffusion-generated images/videos on the ceiling mimic the professional, futuristic aesthetic often presented by technocrats, and ChatGPT- and TTS-generated spatial audio delivers the unfulfilled promises of future technologies. The exact content will be further designed during the residency period.
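The grid structure above can be indexed with simple arithmetic. This is a minimal sketch assuming a 20×20 layout and a fixed room size in world units; the adjective and technology lists are truncated for brevity, and the function and constant names are illustrative, not the project's actual code.

```javascript
// Map a room's grid coordinates (x: adjective row, y: technology column)
// to its conference agenda and its origin on the XY-plane.
const ADJECTIVES = ['Interactive', 'Immersive', 'Inspirational', 'Intelligent']; // truncated
const TECHNOLOGIES = ['Blockchain', 'NFT', 'Web3', 'AI'];                        // truncated
const ROOM_SIZE = 10; // world units per room (assumed)

function roomAt(x, y) {
  return {
    agenda: `The ${ADJECTIVES[x % ADJECTIVES.length]} ${TECHNOLOGIES[y % TECHNOLOGIES.length]}`,
    position: { x: x * ROOM_SIZE, y: y * ROOM_SIZE }, // room origin on the XY-plane
  };
}
```

For example, `roomAt(1, 0)` would yield the agenda "The Immersive Blockchain" at position (10, 0), so every one of the 400 agendas and positions is derived from the two word lists rather than stored by hand.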
The full word list includes Interactive-Immersive-Inspirational-Intelligent-Imaginative-Inclusive-Integrated-Intriguing-Intuitive-Infinite-Innovative-Invigorating-Inventive-Involving-Illuminating-Illustrative-Impactful-Insightful-Illuminative-Interconnected-Interdisciplinary-Independent-Ingenious-Industrious-Introspective-Blockchain-NFT-Web3-AI-VR-AR-MR-IoT-BigData-MachineLearning-Metaverse-Cybersecurity-5G-QuantumComputing-EdgeComputing-CloudComputing-Robotics-Automation-Biotech-FinTech-SmartContracts-DigitalTwins-AugmentedReality-ExtendedReality-WearableTechnology-3DPrinting-Genomics-Nanotechnology-SmartCities-GreenTechnology-AutonomousVehicles-DistributedLedger-ArtificialGeneralIntelligence-NaturalLanguageProcessing-DeepLearning.
Technically, this project uses Next.js and React.js for the frontend framework, with the virtual world constructed via Three.js. Inter-device interaction is implemented via WebSocket, allowing real-time control of the monitor scenes from mobile interactions.


Text written by Jeanyoon Choi

Ⓒ Jeanyoon Choi, 2025