This month, we have prepared a new update for you with a number of new features and bug fixes that we hope you'll enjoy!
Physical hands
Your hands now collide with the environment and will no longer pass through walls. You can also tip over bottles and other objects. A short controller rumble will indicate when your hands make contact with something.
Please note: due to technical and gameplay reasons, the hands do not inflict damage on objects directly, and they do not collide with NPCs or enemies.
This feature can be disabled in the gameplay options if you prefer the hands the way they were before.
ProTubeVR support
Do you own a ProVolver or ForceTube device from ProTubeVR? Then this update enables native support for them - enjoy your guns' recoil on a new level!
dxvk update and stutter fixes
HL2VR uses dxvk to translate the old D3D9 API (which is not supported by VR runtimes) to Vulkan. We have updated our version of dxvk to 1.10.3. In the process, we also discovered that the async patch we applied to avoid the dreaded pipeline compilation stutters was too conservative in deciding when pipelines could be compiled in the background. As a consequence, the game's rendering could still occasionally freeze for a short moment while waiting for graphics pipelines to be ready. This wasn't much of an issue on NVIDIA cards, but it affected AMD cards more often. We have adjusted async compilation to be used more aggressively, and these situations should no longer occur.
So, if you've had issues with the game's rendering occasionally stuttering or freezing during normal gameplay - freezes that didn't coincide with a level transition and showed up in e.g. fpsVR as spikes in CPU time - this update should hopefully resolve them.
Miscellaneous
Fixed the SMG firing animation sometimes playing when in the weapon selection menu
Added controller rumble effects when ejecting a magazine, grabbing a mag from the shoulder, inserting a mag, or grabbing a weapon with your second hand
Can you hear the bells ringing yet? Can you smell the scent of spruce in the streets or at home? If not, our new DLC is for you! AURA: Hentai Cards - Jelly Christmas DLC is already waiting for you on Steam.
❄️ "Jelly Christmas” DLC
San Helm, a small town on the map, becomes a real trap for Heyla and our MC. They are in danger of failing a SPECIAL mission from the King! You'll have to work hard to complete it and get the necessary item. Meet new charming monster girls, explore new maps, and enjoy passionate H-scenes that can melt even the ice.
🌐 Patch v1.2
We're growing far and wide! Our game AURA: Hentai Cards has been translated into 5 new languages: French, Dutch, Polish, Spanish and Portuguese (Brazilian). This means that even more people will be able to enjoy the game in their native language and better understand the story and tasks. The San Helm region has also been added to the map. Minor improvements have been made to the code.
🎁 Benefits
On Steam you can also find additional materials for the game to brighten the wait for the new DLC. We are happy to announce that the wallpaper pack, soundtrack, and artbook have also been updated! And those who buy Jelly Christmas will also get new desktop wallpapers (look for them in the game folder).
💸 Discounts!
The add-on has just been released, and you can already get a good deal on it! From December 8th to 18th there will be a 20% discount on the main game (31% on the DIVINE EDITION) and the additional materials, and a 10% discount on Jelly Christmas! If you haven't bought them yet, hurry to take advantage of this offer 😉
Get our DLC, made with love! We will be very grateful if you support our games, and we're over the moon about the reviews on Steam. Share the game with friends whose languages now appear in it, and take care of your neighbours ❤️🔥
With love, TopHouse Family 😘
P.S. Look forward to next year's REVENGE OF THE FOX SPIRIT DLC!
Start your rule now! Save 20% on Great Houses of Calderia from December 8 to 11.
20% Discount - Weekend sale
Calderia calls for new rulers: settle your House and start your legacy! Enjoy a special 20% discount on Great Houses of Calderia this weekend, from today until December 11. Grab your copy now and enjoy the latest upgrades and features!
The Emperor has entrusted you with great responsibility; answer his call and settle your House in Calderia! Noble Houses stand as pillars of power, each requiring a leader to make crucial decisions. Whether it's arranging alliances, waging war, scheming, trading, or allocating resources, your objective is to expand your power and rise through the ranks of Calderia.
Your support makes our world go round! Share your feedback and suggestions with us! Your contribution plays a major role in the game's development. Thanks for helping us create the best version possible!
As we approach the final stages of our Early Access journey, with Update 4 just around the corner, we have some news to share. As many of you already know, Inkulinati includes a local multiplayer mode that lets you play together with friends at your place. During recent development we tried to bring this feature online, but unfortunately Inkulinati will not feature an online multiplayer mode at its 1.0 release. This decision wasn't made lightly, and despite our best efforts to let you find opponents across the internet, it turned out to be impossible for us in the scenario we are in. Let us provide some context on why this decision was made.
Initially, when we planned for online multiplayer, the game was smaller in scale, making it easier to add extra features. However, as we expanded the game with more content like new Beasts, Masters, Battlefields, Academy classes, and more, and also worked on porting the game to various platforms (more news about that coming here soon), adding an online multiplayer component across all these platforms turned out to be too heavy a task to shoulder while at the same time preparing everything for the 1.0 release. It was an ambitious goal to begin with, but at this point in development it proved too risky for the game's stability.
Essentially, we faced a choice between enhancing the existing offline experience (which we wanted to make even better and pack with a lot more content) or creating multiplayer at all costs, which might have ended up very simple and unsatisfying for players used to other 2D multiplayer strategies. The challenges of implementing online multiplayer at this stage could have compromised the overall gaming experience. In other words – given our very small team – we had to make a difficult decision.
It wasn’t an easy decision, but at least for now, we decided to focus solely on the offline experience and polish it as much as possible before the 1.0 release. Our team is currently focused on creating new content, enhancing existing features based on your valuable feedback, and ensuring a successful launch on additional platforms. We want to maintain the high standard reflected in the current 89% User Rating on Steam.
While we won't have an online multiplayer mode, there is positive news to share: local multiplayer remains intact and will receive improvements, including new Beasts, Battlefields, and more. You can still play online against your friends using the Steam Remote Play feature as a workaround. Additionally, new Inkulinati content, featuring a brand new water army, is on the horizon. The console versions are making great progress, and we continue to refine and expand the game. And there are more as-yet-unannounced Inkulinati things that we will reveal soon.
We appreciate your understanding of this situation. We know that some of you – especially those who already bought the game and counted on some multiplayer experiences in the future – wanted to hear some different news today. However, we are working together with our publisher to find a way to somehow make it up to you. Once we know more, we will let you know.
As always, your feedback is important to us, and we remain at your disposal to discuss any part of this. Depending on the future trajectory and the game's success, we may revisit the idea of an online multiplayer feature post-release, but we can’t and don’t want to make any promises about the future.
We hope you can understand our decision. Stay tuned for the upcoming Inkulinati "Tactical Swimming" Major Update!
Following our recent announcement of the name change from “Parallel Lab” to “Parallel Experiment”, we’re excited to let you delve deeper into the journey of our game’s development. That’s why we’re launching a series of devlogs to take you along on our adventure, showcasing our progress as we approach the release.
In today’s entry, we’ll explore the reasons behind the title change, recall how Parallel Experiment began, look at where it stands now, and share our vision for its future. Stay tuned as we unveil the evolution of ideas shaping Parallel Experiment into an experience unlike any before.
Knock! Knock! A small sneak peek at one of our new puzzles!
From Success to Evolution
The journey of Parallel Experiment began right after the success of our first full-blown Steam game, Unboxing the Cryptic Killer. Initially, the plan was to remake our web game, Parallel Lab, just as we did with “Unboxing”. However, as we dug deeper into the project, we realized that our aspirations had grown. The more we learned about game development, the more we understood what players sought in games like ours. A simple remake felt like standing still, and we want each of our games to be something more.
Walkie-Talkies are back!
This idea sparked a new ambition in us. We didn’t just want to make a game that would be “good”; we aimed to push the boundaries of our creativity and technical skills. Our vision expanded, and with it, the scope of what Parallel Experiment could be. It was no longer about remaking something old but bringing something new and extraordinary to the table – a game that could stand as evidence of our growth and passion as developers.
And that’s exactly why we decided to change the name. We wanted to stay close to our initial idea for the game, but we also needed to reflect the major change happening not only in the game itself, but in our mindset.
Making Our Vision Real
As we stand today, Parallel Experiment is a vibrant, living project, pulsating with potential. The decision to create a demo for Steam Next Fest was pivotal. It’s not just about showcasing our game; it’s about testing our newly formed vision in practice. This demo process was revolutionary for us. It challenged us to integrate a more engaging narrative, design puzzles that were both fun and thought-provoking, and explore new avenues for player interaction. More than that, we aimed to make all these pieces fit together.
To do that, we tried something new. We had our first get-together. As a remote team, meeting in person was an extraordinary experience, especially for discussing the major aspects of Parallel Experiment. We also playtested the existing puzzles to see if any changes were needed. This session helped us finalize the details of the demo puzzles and create an interesting and mysterious story outline.
Brainstorming hard! A photo from our get-together.
The result? We now have a prototype of a demo that’s more than just a slice of gameplay; it’s a fully realized segment of our larger vision, surpassing Unsolved Case in both scope and depth. This prototype is a proof of concept, showing us – and soon, the world – what “Parallel Experiment” is truly capable of. It’s a small part of a much larger picture, but one that we’re immensely proud of and excited to share.
Imagining the Future
Our vision for Parallel Experiment is crystal clear: to redefine what an escape room game can be. We’re not just creating a game; we’re crafting an experience. Our aim is to weave together a tapestry of intricate puzzles, a storyline that keeps players on the edge of their seats, and gameplay that encourages teamwork in new and innovative ways. We want “Parallel Experiment” to be a game that stays with players long after they’ve turned off their screens.
We envision a world rich in detail and narrative depth, where every puzzle is a step deeper into the heart of a mystery. In Parallel Experiment we are bringing everything to the next level - adventure, suspense, and collaborative discovery. We’re not just aiming to meet the standards set by our previous games; we’re aiming to surpass them, providing an unforgettable experience that sets a new benchmark for escape room games!
That’s all from us for today! Hop onto our Discord if you’d like to ask any questions, or simply let us know what you think about devlogs like this one. And make sure to wishlist Parallel Experiment on Steam!
Hello Kerbonauts. I'm Ghassen, also known as 'Blackrack,' the newest graphics programmer on the team. You have no doubt noticed that we have improved the atmosphere rendering in v0.1.5.0. Today I’m going to share with you some insights into those improvements, as well as some of the improvements that are going to be in v0.2.0.0.
Inspecting the Atmosphere
This is how our atmosphere appeared in v0.1.4.0 on Kerbin:
We can see a very nice-looking sky. However, the effect is very subdued on the terrain, and we have trouble reading the terrain topography: it is difficult to tell what we are looking at in the distance, and the sense of scale escapes us. Are those mountains? Are those hills?
Cut to v0.1.5.0, and we can immediately see a big improvement in the scene's readability.
We can now immediately get a sense of how far away things are and we get a better sense of scale. This is what’s known as aerial perspective.
How the Atmosphere is Rendered
We are using a precomputed atmospheric scattering method which is standard nowadays in computer graphics, and popularized by Eric Bruneton.
It is precomputed, meaning all the heavy calculations involved in simulating how light scatters through the atmosphere are done once for all possible altitudes and sun angles, and then stored in compact, easy-to-access tables. The latitude and longitude of the observer on the planet do not matter, because we can exploit symmetries and effectively just change the altitude and sun angles to get the scattering at any viewpoint.
These tables can then be used to display the effect in a very performance-friendly manner while the game is running. They are known as look-up tables. This is what some of the slices in our look-up tables look like:
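As a rough sketch of the idea (a toy model, not KSP2's actual implementation), here is a small transmittance table baked in Python for a flat exponential atmosphere, indexed by altitude and sun zenith angle. All constants, names, and resolutions are illustrative assumptions:

```python
import math

# Toy precomputation sketch: a 2D transmittance look-up table indexed by
# (altitude, sun zenith angle). All constants are illustrative, not KSP2's.
SCALE_HEIGHT = 8000.0      # air density falls off exponentially with altitude
ATMO_TOP = 60000.0         # top of the atmosphere, in metres
EXTINCTION = 1.5e-5        # extinction coefficient at sea level

def transmittance(altitude, cos_zenith, steps=64):
    """March from `altitude` towards the sun and accumulate optical depth."""
    if cos_zenith <= 0.0:
        return 0.0  # sun at or below the horizon: fully occluded in this toy model
    # Path length until we exit the top of the atmosphere (flat-atmosphere
    # approximation to keep the sketch short; a real LUT intersects a sphere).
    path = (ATMO_TOP - altitude) / cos_zenith
    ds = path / steps
    optical_depth = 0.0
    for i in range(steps):
        h = altitude + (i + 0.5) * ds * cos_zenith   # altitude at the midpoint
        optical_depth += math.exp(-h / SCALE_HEIGHT) * EXTINCTION * ds
    return math.exp(-optical_depth)

# Bake the table once; at runtime a shader would just sample it as a texture.
ALT_RES, SUN_RES = 32, 32
lut = [[transmittance(a / (ALT_RES - 1) * ATMO_TOP, s / (SUN_RES - 1))
        for s in range(SUN_RES)] for a in range(ALT_RES)]
```

The expensive integration runs only at bake time; a lookup at runtime is a single texture fetch.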
How Aerial Perspective is Rendered
The look-up tables I’ve described earlier can be used to find the colour of the sky for any given viewpoint inside or outside the atmosphere, as well as how much the atmosphere occludes celestial objects behind it (this is known as transmittance or also extinction, it describes how much of the original object’s light is transmitted and makes it to the observer).
The look-up tables only give us the light scattered towards us from the edge of the atmosphere - they assume we are always looking towards the edge - so we cannot use them directly to get the colour of the atmosphere up to an object. Storing that extra dimension would make the look-up tables impractically big and eat up our memory budget.
However, since the look-up tables allow us to get the colour of the sky from any viewpoint, we can re-express the scattered light up to a point/object as the difference between two samples to the edge of the atmosphere, starting from different positions.
We must also apply the transmittance from the observer to the second sample (in red on the diagram) for everything to be correct.
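The difference trick can be written out in a few lines. Here `scatter_from_o` and `scatter_from_p` stand in for look-up table fetches towards the atmosphere's edge, and `trans_o_to_p` for the observer-to-point transmittance; all values are illustrative:

```python
def aerial_perspective(scatter_from_o, scatter_from_p, trans_o_to_p):
    # Per-channel: L(O..P) = L(O..edge) - T(O..P) * L(P..edge)
    # i.e. light scattered on the whole path, minus the part scattered beyond
    # P, attenuated by the air between O and P.
    return tuple(so - t * sp
                 for so, sp, t in zip(scatter_from_o, scatter_from_p, trans_o_to_p))

def apply_to_object(object_color, in_scatter, trans_o_to_p):
    # Final pixel: attenuate the object's own light, then add the in-scattering.
    return tuple(t * c + s
                 for c, s, t in zip(object_color, in_scatter, trans_o_to_p))
```

With zero atmosphere between O and P (transmittance 1 and equal samples), the in-scattering correctly collapses to zero.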
Putting It In-Game
So now that we know the method to render aerial perspective, we can plug it in-game, and see what we get. Behold:
Hmm, that looks really strange around the horizon. So what's happening here?
Recall that we are using look-up tables; these are loaded on the graphics card as textures, and they have limited resolution and precision (bit depth). The aerial perspective method described earlier only makes precision issues worse by taking the difference between two samples, especially in high-variance areas (typically around the horizon) where any imprecision is amplified.
The way to deal with this is to first inspect the look-up tables, see if anything is stored in low-precision textures or with lossy compression, and use higher precision where needed (typically 16-bit or 32-bit per-channel floating-point textures).
After that, we can change the parametrization of how samples are distributed across the look-up table to maximize resolution where it is needed. The original paper offers a nice way to distribute samples, but we found that it works best for physical settings matching those of Earth, not for some of the settings used at Kerbal scale.
Finally, we review all the lossy transformations in the math and try to minimize any loss of precision and guard against various edge cases.
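To illustrate what such a reparametrization can look like (the sqrt warp below is an assumption for illustration, not the exact mapping used here), texels can be concentrated around the horizon by warping the view-angle cosine before it is used as a texture coordinate:

```python
import math

# Sketch of a non-linear parametrization: instead of storing the look-up
# table linearly in the view-angle cosine `mu`, warp the coordinate so more
# texels land near the horizon (mu ~ 0), where the signal varies fastest.

def mu_to_coord(mu):
    """Map a view cosine in [-1, 1] to a texture coordinate in [0, 1]."""
    return 0.5 + 0.5 * math.copysign(math.sqrt(abs(mu)), mu)

def coord_to_mu(u):
    """Inverse mapping, used when baking the table."""
    x = 2.0 * u - 1.0
    return math.copysign(x * x, x)
```

Because the warp's slope is steepest near mu = 0, two adjacent texels near the horizon cover a much smaller range of angles than two texels near the zenith, which is exactly where the extra resolution is needed.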
This is where most of the engineering effort in implementing precomputed atmospheric scattering is spent. Right now, we have gotten our implementation to a good place; however, the inherent limitations of the method mean that in the future we will move to a different, non-precomputed method which doesn't suffer from these issues and will allow us greater flexibility.
The Importance of Mie Scattering
We simulate Rayleigh scattering (air particles), Mie scattering (water droplets and aerosols), and ozone absorption; each is important for representing a different effect and rendering all the kinds of atmospheres we want.
Mie scattering has a particularly noticeable effect and can be used to make atmospheres look foggy and cinematic, all while keeping a realistic look. I took these screenshots early in testing the atmosphere changes to illustrate the difference that increasing Mie scattering makes to a scene:
In the end we went with a relatively subdued setting on Kerbin and a nice heavy setting on Laythe to set them apart, also as a reward for flying to Laythe.
Atmosphere as Lighting
Recall the transmittance we discussed earlier, the fraction of light that reaches the observer and objects in the atmosphere. We can now use that to light objects by applying it to sunlight; this gives us the very nice, soft lighting you can see around sunsets and sunrises:
We can also use the transmittance on the clouds. Notice how areas in direct light get a nice reddish colour, while areas not in direct light get ambient light, and we get a very nice contrast between the reddish transmittance and the faint bluish ambient:
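A minimal sketch of transmittance-as-lighting: white sunlight is multiplied per-channel by the transmittance along the sun ray, and a faint bluish ambient term stands in for scattered sky light. The sun, ambient, and transmittance values here are made up for illustration:

```python
def lit_color(albedo, trans_to_sun, sun=(1.0, 1.0, 1.0), ambient=(0.05, 0.07, 0.12)):
    # Direct light is sunlight attenuated by the atmosphere (the transmittance);
    # ambient approximates the scattered sky light reaching shadowed areas.
    return tuple(a * (s * t + amb)
                 for a, s, t, amb in zip(albedo, sun, trans_to_sun, ambient))

# Near sunset the blue channel is attenuated most, so direct light turns red
# while the ambient contribution stays slightly bluish:
sunset = lit_color((1.0, 1.0, 1.0), trans_to_sun=(0.8, 0.4, 0.1))
```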
Using the atmosphere to do lighting also simplifies the artists' workflow: the alternative was to approximate the different lighting parameters at different times of day via various settings, and it was very difficult to make the clouds look “right” at every time of day. Now we have less work to do, and it looks better and more coherent.
Speaking of clouds, next we will discuss some of the performance improvements coming in v0.2.0.0, but first let's look at how clouds are rendered in more detail.
How Clouds are Rendered
Modern clouds are rendered via raymarching, a technique that involves “walking” through a 3D volume, incrementally sampling properties like density and colour as we move along, and performing lighting calculations. This method provides a more accurate and visually appealing result compared to traditional rendering techniques and is very well suited to rendering transparencies and volumetric effects. This figure shows in red all the samples we have to do for a single ray/pixel on-screen:
Because of the number of samples we must take during the raymarching process, it is very demanding performance-wise. A solution to this is to render at low resolution and upscale.
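The core raymarching loop can be sketched as follows. `density_at` stands in for the real 3D noise/texture lookup, and the extinction coefficient, step count, and "cloud" shape are illustrative:

```python
import math

def density_at(t):
    # Toy cloud: a dense band between t=2 and t=4 along the ray.
    return 0.8 if 2.0 <= t <= 4.0 else 0.0

def raymarch(t_start, t_end, steps=64, sigma=1.0, light=1.0):
    """Walk the ray in fixed steps, compositing colour front to back."""
    dt = (t_end - t_start) / steps
    transmittance = 1.0
    color = 0.0
    for i in range(steps):
        t = t_start + (i + 0.5) * dt
        d = density_at(t)
        if d > 0.0:
            step_trans = math.exp(-sigma * d * dt)
            # Light scattered at this step is attenuated by everything
            # in front of it (the accumulated transmittance so far).
            color += transmittance * (1.0 - step_trans) * light
            transmittance *= step_trans
        if transmittance < 1e-3:   # early exit once effectively opaque
            break
    return color, transmittance
```

Note the cost scales with the number of steps per ray, which is why rendering every pixel at full resolution is expensive and motivates the upscaling described next.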
Temporal Upscaling
Temporal upscaling was introduced in v0.1.5.0. The idea is to render a different subset of the pixels every frame. This is similar to checkerboard rendering, if you're familiar with the concept, but generalized and not locked to half-resolution rendering. This diagram shows how 4x temporal upscaling works: a full-resolution image is reconstructed over 4 frames:
In movement, the old pixels are moved to where they should be on the current frame, based on their position in space and how much the camera moved since the last frame; this is called reprojection.
After moving the old pixels, their colour is validated against neighbouring new pixels to minimize temporal artifacts; this is called neighbourhood clipping and is the foundation of modern temporal techniques like TAA.
Despite the neighbourhood clipping, we were still getting artifacts and issues in motion after this stage, due to the high number of “old” pixels compared to “new” pixels; typically this manifests as smearing or flickering. Our solution was to re-render the problematic areas separately at normal resolution, since these areas are only a small part of the final image.
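The bookkeeping for a 4x scheme can be sketched like this. The offset pattern and the min/max clipping rule are illustrative, not the engine's exact code:

```python
# 4x temporal upscaling: each frame renders one pixel of every 2x2 quad,
# so a full-resolution image is rebuilt over 4 frames.
FRAME_OFFSETS = [(0, 0), (1, 1), (1, 0), (0, 1)]  # which quad pixel is fresh

def is_fresh(x, y, frame):
    """True if pixel (x, y) is freshly rendered this frame."""
    return (x % 2, y % 2) == FRAME_OFFSETS[frame % 4]

def clip_to_neighbourhood(history, neighbours):
    """Neighbourhood clipping: clamp a reprojected colour to the min/max of
    the freshly rendered neighbours, suppressing stale ghosting."""
    lo, hi = min(neighbours), max(neighbours)
    return max(lo, min(hi, history))
```

Each frame only a quarter of the pixels are fresh, which is where the smearing risk comes from: three quarters of the image is reprojected history.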
This sounds great in theory, but while flying around clouds in a fairly heavy scenario we can see the following timings on a 2080 super at 1440p:
Low-resolution rendering of new pixels: 5.45 ms
Reproject old pixels and assemble full resolution image: 0.12 ms
Re-rendering of problem areas: 4.11 ms
Process and add clouds to the rest of the image: 0.09 ms
For perspective, if we want to reach 60 fps we need to render the whole frame in ~16.6 ms, so the re-rendering step takes a sizable chunk of rendering time in v0.1.5.0, even though this approach is still faster than what we did in v0.1.4.0.
This is because those re-rendered areas are at the edges of clouds where rays must travel furthest and evaluate the most samples before becoming opaque or reaching the boundary of the layer.
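The budget arithmetic behind the timings above, written out:

```python
# Sanity check of the v0.1.5.0 numbers: summing the four cloud passes and
# comparing against a 60 fps frame budget.
passes_ms = {
    "low-res new pixels": 5.45,
    "reproject + assemble": 0.12,
    "re-render problem areas": 4.11,
    "composite into frame": 0.09,
}
total = sum(passes_ms.values())          # ~9.77 ms spent on clouds alone
budget = 1000.0 / 60.0                   # ~16.67 ms per frame at 60 fps
rerender_share = passes_ms["re-render problem areas"] / total  # ~42% of cloud time
```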
For v0.2.0.0 we took a bit more inspiration from temporal techniques to find an alternative solution to re-rendering problem areas: If colour-based neighbourhood clipping isn’t sufficient, we can use depth and motion information like speed and direction of movement (on-screen) to try and identify when reprojected pixels don’t belong to the same cloud surface/area and invalidate them as needed. The idea is to store all this information from the previous frame, and every frame we do a comparison with the previous one to get a probability that a reprojected pixel/colour does not belong to the same surface we are currently rendering.
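One way such an invalidation test could look, with thresholds and weighting that are assumptions for illustration, not the shipped heuristic:

```python
import math

def invalidation_probability(prev_depth, cur_depth, prev_motion, cur_motion,
                             depth_tol=0.05, motion_tol=2.0):
    """Probability that a reprojected pixel no longer belongs to the same
    cloud surface, based on depth and screen-space motion mismatch."""
    depth_err = abs(prev_depth - cur_depth) / max(cur_depth, 1e-6)
    motion_err = math.hypot(prev_motion[0] - cur_motion[0],
                            prev_motion[1] - cur_motion[1])
    # Either cue alone can reject the history sample; saturate each at 1.
    return max(min(depth_err / depth_tol, 1.0),
               min(motion_err / motion_tol, 1.0))
```

A pixel with a high probability would have its history discarded or heavily down-weighted instead of being blended in, avoiding the smearing without a second full-resolution rendering pass.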
After some implementation and tweaking this ended up working well and we can see the following improvement in rendering performance (screenshots taken on a 2080 super at 1440p, framerate counter in top left):
On the launchpad we went from 77 to 91 fps
In-flight around the cloud layer, we went from 54 to 71 fps
That’s about a 17-31% performance improvement on the whole frame, saving 2 ms to 4 ms on the rendering of the clouds.
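Converting those framerates into frame times confirms the stated savings:

```python
# Before/after numbers from the screenshots above, converted to frame times.
def frame_ms(fps):
    return 1000.0 / fps

launchpad_saved = frame_ms(77) - frame_ms(91)   # ~2.0 ms saved per frame
inflight_saved = frame_ms(54) - frame_ms(71)    # ~4.4 ms saved per frame
launchpad_gain = 91 / 77 - 1.0                  # ~18% more frames per second
inflight_gain = 71 / 54 - 1.0                   # ~31% more frames per second
```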
You can look forward to these performance improvements and more in v0.2.0.0, out on December 19th!