Abstract
Human perception is naturally limited to three-dimensional geometry, raising the question of whether it is possible to train the brain to perceive higher dimensions. While the properties of four- and higher-dimensional hypersolids can be described through mathematical topology, this paper explores an alternative method to enhance human understanding of higher-dimensional spaces. Building upon previous work, such as the holographic visualization of the hypersphere projection, this paper introduces the development of a novel virtual reality (VR) application that displays the color-coded cross-section produced as a hypersphere intersects three-dimensional space. This method offers a refined and visually enhanced representation, evolving from prior techniques, to create a more immersive user experience. The Hypersphere Perception VR Interactive Application was developed with the assistance of ChatGPT 4. This educational application enables users to interact with higher-dimensional geometry in a virtual environment, potentially enhancing spatial reasoning and cognitive abilities. It is designed to work on both computers and mobile devices.
Keywords:
Hypersphere, Virtual Reality, Augmented Reality, Brain Training, Higher Dimensions
- Introduction
A space with more than three spatial dimensions is called hyperspace. In mathematics, the dimensionality of a space is determined by its ability to be divided into two parts by another space of one dimension less. For example, a 1D (one-dimensional) line can be divided by a 0D (zero-dimensional) point, a 2D (two-dimensional) plane can be divided by a 1D line, and a 3D (three-dimensional) space can be divided by a 2D plane. Similarly, a 4D hyperspace can be divided into two parts by a 3D space. Note that within an xD space, x coordinates are needed to identify any given point; e.g., in a 4D hyperspace, four distinct coordinates uniquely identify the position of any given point [1].
- Definition of the 4D hypersphere
In a 4D hyperspace, the set of all points equidistant from a designated point, termed the 'center', establishes the concept of a '4D hypersphere'. This definition parallels the way a circle is defined in a 2D plane and a sphere in 3D space. Within the context of this article, we use the term 'hypersphere' to refer to the '4D hypersphere'. It is worth noting that the same definition extends to hyperspheres in spaces with more than four dimensions [2].
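In Cartesian coordinates this definition takes a compact algebraic form: a hypersphere of radius R centered at the origin of a 4D coordinate system is the set of points

\[ S^3 = \{\, (x, y, z, w) \in \mathbb{R}^4 \;:\; x^2 + y^2 + z^2 + w^2 = R^2 \,\}, \]

directly generalizing the circle equation \(x^2 + y^2 = R^2\) in the plane and the sphere equation \(x^2 + y^2 + z^2 = R^2\) in 3D space.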
- Interactions of the hypersphere with 3D space
To understand how the hypersphere interacts with 3D space, we can study how a 3D solid interacts with a 2D plane and apply the same methodology to the hypersphere. There are three interactions of a 3D solid with a 2D plane: projection, intersection, and development.
We can imagine the projection of the hypersphere onto 3D space, and its intersection with 3D space, in a manner analogous to how a sphere projects onto and intersects a 2D plane. Furthermore, we can consider the development of the cube, or of other 3D solids, as a third interaction with a 2D plane. However, it is impossible to develop a hypersphere into 3D space, just as a sphere cannot be developed directly onto a 2D plane [3]. Instead, we employ the Intersection Method to design and develop a VR application that demonstrates the intersection of the hypersphere with 3D space. We propose that this application may be used to train our brain to comprehend how the hypersphere interacts with 3D space.
- Intersection of the hypersphere with 3D space
To better understand the Intersection Method, we first explain how a 3D sphere intersects with a 2D plane (Figure 1a), and then how the hypersphere intersects with 3D space (Figure 1b).
Figure 1a: Cross-section trace of a sphere passing through a 2D plane. (© 2020 Traperas, Gounaropoulos and Kanellopoulos).
Figure 1b: Cross-section trace of a hypersphere passing through 3D space (the centers of the seven cross-sections are situated on the same 3D space point). (© 2020 Traperas, Gounaropoulos and Kanellopoulos).
- Color-coded hypersphere
Based on Hinton’s method of coloring the edges of a cube [4], Traperas and Kanellopoulos introduced a method that color-codes the coordinates x, y, z of the trirectangular axis system as R, G, B, respectively, creating the Color-Coded Sphere [5], depicted in Figure 2. Each point on the surface of the sphere is then colored by blending the RGB intensities corresponding to the absolute values of its three coordinates.
A fourth dimension may be added as axis W (white). In this way the four coordinates (x, y, z, w) are color-coded as R, G, B, and W respectively, resulting in the color-coded hypersphere. Note that the intensity of each of these four colors is directly linked to the absolute value of the corresponding axis coordinate; that is, each axis within the hypersphere is assigned a single color, whose intensity starts at zero at the axis origin and gradually increases to its maximum at the radius distance of the hypersphere.
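As an illustration of this coloring rule, the following minimal sketch (our own simplified reconstruction in JavaScript, not the application’s production code) maps a point of the hypersphere to a displayable color. Since screens have no separate white channel, the sketch assumes white is approximated by adding the w-intensity equally to all three RGB channels:

```javascript
// Sketch of the color-coding rule: a point (x, y, z, w) on a hypersphere of
// radius R is colored so that each axis contributes an intensity proportional
// to the absolute value of its coordinate: R <- |x|, G <- |y|, B <- |z|, W <- |w|.
// White is approximated here by adding |w|/R equally to all three channels
// (an assumption of this sketch, not necessarily the application's exact rule).
function colorCodePoint(x, y, z, w, R) {
  const r = Math.abs(x) / R;         // red intensity from the x-axis
  const g = Math.abs(y) / R;         // green intensity from the y-axis
  const b = Math.abs(z) / R;         // blue intensity from the z-axis
  const whiteness = Math.abs(w) / R; // white intensity from the w-axis

  // Blend white into all channels and clamp to the displayable range [0, 1].
  return [
    Math.min(1, r + whiteness),
    Math.min(1, g + whiteness),
    Math.min(1, b + whiteness),
  ];
}

// Example: the point where the hypersphere meets the w-axis
// (x = y = z = 0, w = R) is pure white:
console.log(colorCodePoint(0, 0, 0, 1, 1)); // [1, 1, 1]
```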
- Visualization of the color-coded hypersphere
The cross-section of the color-coded hypersphere with 3D space has the shape of a sphere that grows from a tiny dot (as the hypersphere enters the 3D space) to its maximum size (when the cross-section reaches radius R), and then shrinks back to a tiny dot (as the hypersphere exits the 3D space) (Figure 1b).
During the intersection, the intensity of the colors increases gradually along with the magnitude of the sphere. For a clearer understanding of this method, we consider four different ways in which the hypersphere can pass through 3D space, depending on which of the four axes is chosen. The hypersphere enters the 3D space along the chosen axis, which is perpendicular to the other three; within the 3D space, the coordinate along this axis remains zero, so its color contributes no intensity. Each point on the surface of the hypersphere’s cross-section is then colored by blending the absolute values of the remaining three coordinates. Figure 3 depicts the visualization of the color-coded hypersphere entering the 3D world in the four possible ways.
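The analogy can be made quantitative: a sphere of radius R whose center lies at distance d from a 2D plane meets it in a circle of radius \(\sqrt{R^2 - d^2}\); likewise, a hypersphere of radius R whose center lies at distance w (measured along the entry axis) from our 3D space meets it in a sphere of radius

\[ r(w) = \sqrt{R^2 - w^2}, \qquad |w| \le R, \]

which grows from a point at w = −R to the full radius R at w = 0 and shrinks back to a point at w = +R, matching the sequence of cross-sections in Figure 1b.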
Figure 3: A hypersphere passing through the 3D space in 4 different ways: along the white, green, blue or red axis perpendicular to the 3D space.
The sphere’s surface coloring in Figures 2 & 3 was achieved by employing four invisible monochromatic R, G, B, & W projectors. These projectors were placed outside the sphere and illuminated its surface along each of the four axes (x΄x, y΄y, z΄z, w΄w) [5].
Additionally, we have explored the visualization of the hypersphere via a holographic art installation [6].
While this approach did not involve interaction with the user, the subsequent goal was to develop an interactive environment utilizing VR technologies. Building on this progression, the present paper introduces a novel interactive method for coloring the sphere's surface, leading to a more refined and visually enhanced representation.
- The Hypersphere Perception VR Interactive Application
Since our brain is trained to comprehend the intersection of the 3D sphere with the 2D plane, we propose that it could be possible to train our brain to perceive higher dimensions.
In addition, by using state-of-the-art technologies, e.g., Virtual Reality (VR) and/or Augmented Reality (AR), we may design and develop an innovative application that can help individuals develop their spatial reasoning skills and mental visualization by emulating higher dimensions in a virtual environment. Users may then practice navigating and manipulating complex geometric hyper-solids, thereby enhancing their cognitive abilities and understanding of abstract concepts [1, 7].
We propose to focus on comprehending the hypersphere by employing the color-coded hypersphere intersection method, and we have developed an interactive VR hyperspace educational application accordingly. Furthermore, the application can run on a smartphone with its rear camera activated, giving the user a pseudo-augmented environment that may help in comprehending the hypersphere’s properties. Both modes of the application are described in the sections that follow.
- The application’s development methodology
The Hypersphere Perception VR Interactive Application visually shows how higher-dimensional shapes intersect with lower-dimensional space. This method relies on smooth color gradients that change in response to spatial relationships, offering an intuitive way to understand these complex interactions.
The Hypersphere Perception VR Interactive Application utilizes the Three.js web-based platform, which is a powerful tool for the development of VR environments. Three.js, an open-source JavaScript library, facilitates the creation and rendering of animated 3D graphics within web browsers by leveraging the capabilities of the WebGL (Web Graphics Library), which is a JavaScript API for rendering high-performance interactive 3D and 2D graphics within any compatible web browser without the use of plug-ins [8]. Implemented in JavaScript, it ensures cross-platform compatibility, allowing it to function seamlessly across all major web browsers and devices [9]. With its extensive range of 3D objects, advanced animation capabilities, and performance optimization, Three.js is particularly well-suited for the development of immersive and visually sophisticated virtual experiences [10, 11].
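For readers unfamiliar with the library, the following minimal sketch shows the typical structure of a Three.js scene of the kind the application builds upon (a generic illustration of the library’s standard API, not the application’s actual source code):

```javascript
import * as THREE from 'three';

// Scene, camera, and WebGL renderer: the three core Three.js objects.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000
);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A sphere mesh; MeshNormalMaterial colors the surface by its normals,
// giving a smooth gradient without requiring lights or custom shaders.
const sphere = new THREE.Mesh(
  new THREE.SphereGeometry(1, 64, 64),
  new THREE.MeshNormalMaterial()
);
scene.add(sphere);

// Render loop: redraw the scene on every animation frame.
function animate() {
  requestAnimationFrame(animate);
  sphere.rotation.y += 0.005;
  renderer.render(scene, camera);
}
animate();
```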
We decided to employ OpenAI’s ChatGPT 4 [12] to develop the JavaScript software. In particular, the following prompt was given to ChatGPT 4:
“I want a sphere where there are 6 points on its surface, of 6 different colors spread radially upon it correspondingly. I want to have the 6 colors spread out smoothly blended from the 6 points of the surface. I want to control these colors. Give me the code applied to Next.js and Three.js and help me to clarify the key technical details, including the use of fragment materials for rendering in Three.js.”
After some trials and iterations, ChatGPT 4 produced the required JavaScript code, which is explained, again with the help of ChatGPT 4 [13]. A summary of this explanation is given below for the two distinct cases of interest, namely a) 3D-to-2D: sphere and intersection circle, and b) 4D-to-3D: dynamic gradient effect.
- 3D to 2D: Sphere and intersection circle
The sphere and the intersection circle are colored using a custom shader-based technique, which we created using ChatGPT 4 open AI technologies [13].
Shader: The shader calculates the distance between points on the sphere’s surface and six predefined reference points in 3D space (aligned with the x, y, and z axes). The colors blend dynamically based on these distances, with closer points having a stronger influence on the final color.
Sphere Coloring: A gradient coloring effect is applied across the surface of the sphere, transitioning smoothly between the reference colors. This creates a visually engaging and intuitive representation of how the sphere interacts with a 2D plane.
Intersection Circle: When the sphere intersects with a plane, the circle formed inherits the same gradient-based coloring method. As the sphere moves, the colors on the intersection circle shift, maintaining a smooth gradient effect that reflects the ongoing dimensional interaction.
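To make these three elements concrete, the sketch below shows how such distance-weighted blending can be expressed as a Three.js ShaderMaterial. This is our own illustrative reconstruction of the approach: the reference colors, the inverse-distance weighting, and all variable names are assumptions of the sketch, not the code ChatGPT 4 generated.

```javascript
import * as THREE from 'three';

// Six reference points on the +/- x, y, z axes, each with its own color.
const referencePoints = [
  new THREE.Vector3(1, 0, 0), new THREE.Vector3(-1, 0, 0),
  new THREE.Vector3(0, 1, 0), new THREE.Vector3(0, -1, 0),
  new THREE.Vector3(0, 0, 1), new THREE.Vector3(0, 0, -1),
];
const referenceColors = [
  new THREE.Color('red'),   new THREE.Color('cyan'),
  new THREE.Color('green'), new THREE.Color('magenta'),
  new THREE.Color('blue'),  new THREE.Color('yellow'),
];

// Each fragment blends the six colors, weighted by its proximity to the
// six reference points (closer points have a stronger influence).
const material = new THREE.ShaderMaterial({
  uniforms: {
    uPoints: { value: referencePoints },
    uColors: { value: referenceColors },
  },
  vertexShader: /* glsl */ `
    varying vec3 vPosition;
    void main() {
      vPosition = position; // object-space position on the sphere surface
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform vec3 uPoints[6];
    uniform vec3 uColors[6];
    varying vec3 vPosition;
    void main() {
      vec3 blended = vec3(0.0);
      float totalWeight = 0.0;
      for (int i = 0; i < 6; i++) {
        // Inverse-distance weighting; the epsilon avoids division by zero.
        float w = 1.0 / (distance(normalize(vPosition), uPoints[i]) + 0.0001);
        blended += uColors[i] * w;
        totalWeight += w;
      }
      gl_FragColor = vec4(blended / totalWeight, 1.0);
    }
  `,
});

const sphere = new THREE.Mesh(new THREE.SphereGeometry(1, 128, 128), material);
```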
- 4D to 3D: Dynamic gradient effect
In the 4D-to-3D simulation, the sphere’s surface also adopts a dynamic color gradient effect. The shader assigns colors to each point on the sphere based on its proximity to predefined 3D points. This results in a smooth, real-time gradient that adjusts as the sphere moves or changes its configuration, helping visualize interactions between 3D objects and higher-dimensional spaces.
By blending colors based on distance, this technique makes abstract concepts, such as the interaction between different dimensions, more accessible. The result is a high-quality, smooth, and dynamic visual representation that adjusts naturally as the sphere moves, turns, or scales, allowing users to intuitively grasp the relationship between different dimensions.
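A minimal sketch of how such a dynamic slice can be driven frame-by-frame is given below (an assumed reconstruction; the application’s actual update logic may differ). It uses the cross-section radius formula r(w) = √(R² − w²) derived earlier:

```javascript
// Sketch of a 4D-to-3D animation step: the visible cross-section of the
// hypersphere is a sphere of radius sqrt(R^2 - w^2), rescaled each frame
// as the slice offset w advances along the chosen fourth axis.
const R = 1.0;     // hypersphere radius
let w = -R;        // current offset of the 3D slice along the entry axis
const step = 0.01; // per-frame advance of the slice

function updateSlice(sphereMesh) {
  w += step;
  if (w > R) w = -R; // loop the transit for demonstration purposes

  // Cross-section radius at offset w (zero at entry and exit, R at w = 0).
  const r = Math.sqrt(Math.max(0, R * R - w * w));
  sphereMesh.scale.setScalar(Math.max(r, 1e-4)); // avoid a degenerate zero scale

  // Surface colors follow the color-coding rule: the coordinate along the
  // entry axis is zero inside the slice, so its color contributes nothing.
}
```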
- Hypersphere Perception VR Interactive Application’s functional description
The concept of an extra dimension and its theoretical framework are complex and challenging to comprehend. We believe that the Hypersphere Perception VR Interactive Application can help the user experiment with visualizing one of the hyper-dimensional structures, namely the hypersphere entering 3D space.
Figure 4 shows the application’s 'Home' screen menu, which offers two spatial simulations: the 3D-to-2D simulation, which examines the intersection of a 3D sphere with a 2D plane (readily understood by our brain), and the 4D-to-3D simulation, which examines the intersection of a hypersphere with 3D space (not at all familiar to us).
Figure 4: The Hypersphere Perception VR Interactive Application’s Home menu.
- 3D-to-2D spatial simulation
Upon selecting the 3D-to-2D simulation, the application presents an environment with two distinct planes, perpendicular to each other, situated in a black 3D space. The graphics on the left display the intersection of a 3D color-coded sphere with a plane, i.e., a 2D slice. The graphics on the right visualize the same 2D slice at a 90-degree angle, showing the current intersection slice as a colored disk whose radius and color pattern change dynamically as the sphere passes through the 2D plane (Figure 5).
Figure 5: A 2D cross section of the color-coded sphere.
Regarding the color-coded sphere, it is a solid sphere with its axes xx’, yy’, and zz’ colored Red, Green, and Blue, respectively. To achieve this quality of coloring, an algorithm was developed to color the surface of the sphere pixel-by-pixel, ensuring a smooth transition where different colors blend, adhering to the color-mixing rules. Note the quality enhancement between Figures 2 & 3, which employ the “monochromatic projectors” method, and Figures 5, 6, 7, 8 & 9, which employ the “dynamic color gradient” method.
As the color-coded sphere intersects the 2D plane at various angles, the resulting depiction on the other plane is a circle that grows and shrinks, changing its color in accordance with the variation in its radius.
In the application’s interface, shown in Figure 5, several buttons are available to assist the user. The 'Axes on/off' button (top LHS corner) toggles the display of the axes, aiding clearer geometric orientation.
A small menu (top RHS corner) contains three buttons: the 'Home' button, which directs the user to the main home page; the 'Reset' button, which restores the XYZ rotation values to their default settings; and the Information 'i' button, which offers explanations of the visual elements and the functionality of the other controls.
The interface also includes three sliders (bottom RHS), allowing users to adjust the angle at which the 3D sphere intersects the 2D plane. Furthermore, the cross-section buttons (bottom LHS) control the percentage of the sphere’s cross-section as it moves through the plane, enabling users to manipulate the size of the intersection.
- 4D-to-3D spatial simulation
Figure 6 depicts a screenshot of the home page of the 4D-to-3D simulation. The Hypersphere Perception VR Interactive Application initially shows an empty black 3D space. When the play button is clicked (bottom LHS corner), a small colored point emerges as the color-coded cross-section of the hypersphere begins to enter the 3D space; the sphere’s size then increases to r=50%R, continues to r=100%R (maximum), and finally decreases back to a point as the hypersphere exits the 3D space. The user may interact with the hypersphere’s evolution by clicking the 'Left', 'Right' and 'Play' buttons (bottom LHS corner) and observe it as it moves through the 3D space. By using the 'R', 'G', 'B', & 'W' buttons (bottom RHS), the user may also select the axis along which the hypersphere enters the 3D space. For example, Figure 6 shows the 'White' axis button selected: the hypersphere enters the R, G, B 3D space along the W-axis, whose value is zero within the 3D space, which is why there is no white color on the surface of the resulting cross-section. The user can interact with the color-coded hypersphere cross-section as it intersects the 3D space, observing it as it enters, exits, and rotates. Note that, as the hypersphere enters the 3D space, it is colored in accordance with the color-coding method [5].
Figure 6a: 50% cross-section of the hypersphere with 3D space, W-axis selected.
Figure 6b: 100% cross-section of the hypersphere with 3D space, W-axis selected.
The application interface includes a menu in the top RHS corner with three buttons: the 'Home' button, which directs the user to the main home page; the 'Reset' button, which restores the hypersphere to its initial state; and the 'Info' button, which provides an explanation of the application’s functionality.
As the sphere changes size, the color mix emerges in different combinations, indicating the progression of the hypersphere as it enters and exits the 3D space. It is also possible to rotate the hypersphere with the mouse.
Figure 7 shows the 3D cross-section of the hypersphere entering the 3D space. In particular, the red, green, blue and white color buttons define the axis along which the cross-section of the hypersphere enters, as shown in the four corresponding screenshots.
Figure 7: 3D cross-sections of the hypersphere entering the 3D space along the R, G, B & W axis.
- Mobile phone compatibility
For mobile compatibility, the Hypersphere Perception VR Interactive Application performs well but requires some optimizations to enhance the experience on various devices. It accesses the device's rear camera for a live video background, which is supported on most modern mobile browsers but depends on user permissions. The 3D controls are touch-compatible, allowing users to rotate and zoom, though fine-tuning can improve the touch response. To ensure smooth performance on mobile, particularly on devices with lower processing power, reducing the complexity of the 3D model and providing a fallback for the video background can prevent lag. Additionally, scaling adjustments are necessary to handle different screen sizes and orientations, ensuring the layout adapts seamlessly. Finally, given mobile devices' limited battery life, adding an option to pause intensive background animations could improve efficiency during prolonged use. These optimizations would make the application more user-friendly and accessible on mobile [14].
An important feature for mobile users is the integration of the rear camera as a live video feed, enabling an augmented-reality-like experience. This allows the 3D objects to blend with the real-world environment captured by the device’s camera (Figures 8, 9).
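The camera feed itself relies on the standard browser media API. A minimal sketch of how a rear camera can be attached as a live background is shown below (the styling and fallback behavior are illustrative assumptions, not the application’s exact code):

```javascript
// Minimal sketch: request the rear ("environment") camera and use it as a
// live video background behind the WebGL canvas. Requires a secure context
// (HTTPS) and the user's permission.
async function startCameraBackground() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' }, // prefer the rear camera
    audio: false,
  });

  const video = document.createElement('video');
  video.srcObject = stream;
  video.setAttribute('playsinline', ''); // required for inline playback on iOS Safari
  video.muted = true;
  await video.play();

  // Place the full-screen video behind the (transparent) WebGL canvas.
  Object.assign(video.style, {
    position: 'fixed', top: '0', left: '0',
    width: '100vw', height: '100vh',
    objectFit: 'cover', zIndex: '-1',
  });
  document.body.appendChild(video);
}

startCameraBackground().catch((err) => {
  console.warn('Camera unavailable, falling back to a plain background:', err);
});
```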
Overall, the Hypersphere Perception VR Interactive Application is designed to function smoothly on most modern smartphones, though performance may vary slightly depending on the device’s hardware and browser capabilities.
Figure 8: Screenshot from a mobile, depicting the pseudo-AR experience of the 4D-to-3D cross-section simulation.
Figure 9: Screenshot from a mobile, depicting the pseudo-AR experience of the 4D-to-3D cross-section simulation with the axes showing.
- Future development
In the future, we may improve the Hypersphere Perception VR Interactive Application by increasing its interactivity, e.g., by employing hand-tracking features and a mobile-phone AR frame. With such features, the user may have better control of the color-coded hypersphere interaction at all stages of its transit through 3D space. In addition, the Hypersphere Perception VR Interactive Application should be employed in a controlled experiment, involving an adequate number of users, to determine whether their cognitive perception of the hypersphere can indeed be improved.
- Conclusions
As the fourth spatial dimension is not easily perceived by the human brain, understanding the properties of hyper-solids requires more than just understanding topology. To bridge this gap, we propose merging technology and artistic knowledge to develop tools of extreme aesthetic value. As TechnArtists [15], we believe that by utilizing VR technology, we can create an immersive and interactive experience that helps users approach the cognitive perception of hyperspace.
Our approach is based on the idea that, since our brain is trained to comprehend the intersection of a 3D sphere with a 2D plane, it could possibly be trained to comprehend the intersection of the hypersphere with 3D space; that is, by simulating and displaying the color-coded hypersphere intersection while retaining the ability to manually control its transit through 3D space.
The Hypersphere Perception VR Interactive Application is based upon a pioneering simulation method that deals with the perception of the 4D hypersphere. It provides stimuli to our 2D sensory system, creating a visual representation of the intersection of the hypersphere with 3D space.
In addition, running the application on a smartphone creates a pseudo-AR experience, which allows users to explore and interact with the 4D-to-3D simulation in a way that would not be possible otherwise.
- Acknowledgements
We thank Mr. Triantafyllos Kountardas for his assistance with the development of the application’s software code using ChatGPT 4.
- References
- M. M. Herrera, S. J. Ordóñez and S. Ruiz-Loza. Enhancing mathematical education with spatial visualization tools. Frontiers in Education, Volume 9 (2024). URL:https://doi.org/10.3389/feduc.2024.1229126
- W. P. Thurston. Three-Dimensional Geometry and Topology, Vol. 1. New Jersey: Princeton University Press (1997).
- I. Varghese. Engineering Graphics. Tata McGraw-Hill Education, New Delhi, India (2013).
- C. H. Hinton. The Fourth Dimension. Swan Sonnenschein & Co; John Lane (1904).
- D. Traperas and N. G. Kanellopoulos. Visualizing the hypersphere using Hinton’s method. Technoetic Arts: A Journal of Speculative Research 16, No. 2, 163–178 (2018). URL:https://doi.org/10.1386/tear.16.2.165_1
- C. Gounaropoulos, D. Traperas and N. G. Kanellopoulos. 4D Hypersphere perception via a holographic art installation. International Conference on Digital Culture & AudioVisual Challenges, Ionian University, Corfu, Greece, 28–29 May (2021).
- K. Silseth, R. Steier and H. C. Arnseth. Exploring students’ immersive VR experiences as resources for collaborative meaning making and learning. International Journal of Computer-Supported Collaborative Learning (2024) 19:11–36. URL:https://doi.org/10.1007/s11412-023-09413-0
- MDN Web Docs. WebGL: 2D and 3D graphics for the web. URL:https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API
- Borstch Blog, Development. Building 3D web applications using Three.js. URL:https://borstch.com/blog/building-3d-web-applications-using-threejs
- Three.js docs, 'How to create VR content'. URL:https://threejs.org/docs/#manual/en/introduction/How-to-create-VR-content
- Distillery, Code Chronicles, Engineering, Perspectives, Dec 4, 2023. Exploring the 3D World on the Web with Three.js. URL:https://distillery.com/blog/exploring-the-3d-world-on-the-web-with-three-js/
- OpenAI. GPT-4 is OpenAI’s most advanced system, producing safer and more useful responses. URL:https://openai.com/index/gpt-4/
- ChatGPT 4 conversation. Smooth, gradient-like color map across the sphere. URL:https://chatgpt.com/share/67460fb3-79c8-8000-9878-67b0796f4438
- ChatGPT 4 conversation. Mobile compatibility. URL:https://chatgpt.com/share/67460fe7-65c8-8000-9a54-867252cda2c4
- N. G. Kanellopoulos. From Antikythera Analogue Computer to Quantum Computer and the TechnArtist: Augmented Reality and Art. 11th Audiovisual Arts Festival, Athens Concert Hall (Megaron), Athens, Greece, 20–21 May (2017). URL:http://dx.doi.org/10.13140/RG.2.2.10730.84166
- Authors information
Charilaos Gounaropoulos
Department of Audio and Visual Arts, Ionian University
e-mail: xarisgoun@hotmail.com
Charilaos Gounaropoulos graduated from the Department of Physics, University of Ioannina. He received his MSc in Humanistic Informatics from the Department of Informatics, Ionian University (2019). He is a PhD candidate in the Department of Audio and Visual Arts, Ionian University.
Nikolaos Grigorios Kanellopoulos, https://orcid.org/0000-0001-5682-6395
Department of Audio and Visual Arts, Ionian University
e-mail: kane@ionio.gr
https://avarts.ionio.gr/en/department/people/30-kanellopoulos/
Professor Emeritus Nikolaos Grigorios Kanellopoulos has served as Vice-President of the Ionian University Council (2013-2017), President of the Audiovisual Arts Department (2007-2012) & (2017-2020), Vice-President for the Computer Science Department (2005-2007) and Faculty Member of the Archives & Library Science Department (2003-2007) of Ionian University, as well as Faculty Member of the Computer Science Engineering & Informatics Department of Patras University (1987-2003), and as President of the Greek National School of Dance (2000-2002). He has experience with more than 50 National and European R&D projects in the fields of Computer Applications. His published work includes 3 international patents and 135 research papers/studies. Currently, his main research interest focuses on hyperspace comprehension with the use of Audiovisual Art Interactive Systems (VR/AR).