This update fixes a bunch of bugs and introduces a few new features.
You can now create properties inside a library. These properties can then be dragged and dropped onto object properties to control multiple property changes from a single source property (either through the UI or via script). You can also keyframe or apply signals to the created properties to animate all assigned properties together.
Drag and dropping onto a property:
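Conceptually, a library property works like a single source value fanned out to every property bound to it. Here is a minimal Python sketch of that pattern (the class and method names are hypothetical illustrations, not Simmetri's scripting API):

```python
class SourceProperty:
    """Illustrative sketch of a library property driving several bound
    object properties. All names here are hypothetical, not Simmetri's API."""

    def __init__(self, value=0.0):
        self._value = value
        self._bindings = []               # (object, attribute-name) pairs to drive

    def bind(self, obj, attr):
        """Bind a target property; it immediately syncs to the source value."""
        self._bindings.append((obj, attr))
        setattr(obj, attr, self._value)

    def set(self, value):
        """Changing the source propagates the value to every bound property."""
        self._value = value
        for obj, attr in self._bindings:
            setattr(obj, attr, value)

class Lamp:
    intensity = 0.0

a, b = Lamp(), Lamp()
brightness = SourceProperty(1.0)
brightness.bind(a, "intensity")
brightness.bind(b, "intensity")
brightness.set(0.25)   # both lamps now read 0.25
```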
Physics Simulation Quality
A new property has been added under a Space's Physics property group that lets you select a physics simulation quality. This allows you to control the number of sub-steps used to solve physics motion and collisions, resulting in less error at higher quality settings. Less error means constraints are more stable and collisions are more precise. Previously, Simmetri was set to the equivalent of the "Medium" setting.
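For background, sub-stepping splits each frame's time step into smaller increments so the integrator accumulates less error per step. A rough sketch of the idea (the quality names and sub-step counts here are hypothetical examples, not Simmetri's actual values):

```python
# Illustrative only: how a quality setting could map to physics sub-steps.
# The setting names and counts are hypothetical, not Simmetri's.
QUALITY_SUBSTEPS = {"Low": 2, "Medium": 4, "High": 8}

def step_physics(state, dt, quality="Medium"):
    """Integrate velocity -> position over dt using n smaller sub-steps.

    More sub-steps means each integration step covers less time, so the
    error of the semi-implicit Euler update below shrinks accordingly.
    """
    n = QUALITY_SUBSTEPS[quality]
    h = dt / n
    pos, vel, accel = state
    for _ in range(n):
        vel += accel * h          # update velocity first (semi-implicit Euler)
        pos += vel * h            # then position, for better stability
    return (pos, vel, accel)

# A body falling under gravity for one 60 Hz frame:
pos, vel, _ = step_physics((0.0, 0.0, -9.81), 1/60, "High")
```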
You can now select a framerate under the Preferences Panel / Graphics tab. Whereas before Simmetri defaulted to rendering as fast as possible, you can now limit the framerate to help control energy usage and system load. The framerate can either be "Unlimited" or set to a fraction of your screen's refresh rate (V-Sync). It defaults to V-Sync.
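For context, limiting to a fraction of the refresh rate means targeting an interval of divisor/refresh and sleeping away the unused frame budget. A generic frame-pacing sketch (not Simmetri's implementation):

```python
import time

def frame_interval(refresh_hz, divisor=1):
    """Target interval for 'refresh / divisor' (e.g. 60 Hz / 2 = 30 FPS)."""
    return divisor / refresh_hz

def run_frames(render, refresh_hz=60, divisor=1, frames=3):
    """Render frames, sleeping out the remainder of each frame interval."""
    interval = frame_interval(refresh_hz, divisor)
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render()
        next_deadline += interval
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:          # sleep away the unused frame budget
            time.sleep(remaining)
```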
Additional Bug Fixes
Updated NDI SDK to 3.8
Fixed issues with Spout not working in certain texture share modes.
Exported app now compatible with Wallpaper Engine (use Player.exe as the wallpaper exe)
Now exports preview and readme files when exporting as apps
Now correctly renders using the aspect ratio for projected spot textures
Fixed sound playback bug that prevented some tutorials from playing their spoken text at times
Fixed issues with CharacterActor being twirled around by rolling spheres
FPS always shown in corner of screen while changing Graphics preferences.
In this update, we focused on making user-generated content easier to share and on updating the content browser with improved access to learning resources.
You can now easily upload your universes to the workshop and get a web-link you can send to other Simmetri users to quickly share your experiences or concepts. This panel is also where you can export your show as an application (EXE).
Learning resources, including the interactive tutorials, are now listed within the content browser under the Tutorials tab. Here you can also access the experiences from the YouTube tutorial videos of our 30 Days of Simmetri campaign.
Instead of File/New placing you into a default world with a blue sky and terrain, you will now be presented with a variety of starting points to choose from. We will be adding to the list as time goes on.
Accessing the Explore and Sharing panels
You can now access the Explore and Sharing panels via two new buttons at the top right of the main UI:
Other Bug Fixes
Added a new intro experience
Moved add-ons into a primary explore category
Added web hyperlinks to show descriptions
Added a link to our Discord server on the Explore overlay
Added a link to our YouTube channel on the Explore overlay
Created a Discord bot for the Simmetri Discord channel for share-link embeds
Double-clicking a number with a decimal point now selects the whole number
Added 'About this Universe...' to the file menu to view its description
Added a link back to index in documentation pages
Recent Projects menu now filters out missing projects
Join us on our Discord channel to talk with us. Here's our Discord invite link: Simmetri Discord Server Invite. There you can ask us anything about Simmetri and we'll always answer.
Simmetri On The Live Stage
Simmetri was recently used to make a multi-user interactive escape-the-room puzzle that was projected onto a 44' dome. Participants needed to work together to solve the puzzle and save Boston from an alien attack. Here's a photo of the show:
If you're curious how to do this, here's a support article that shows you how to set this up in Simmetri: Rendering To A Dome
Steam user Oeozero writes: "The software is very well thought out and makes it easy to quickly create anything I can imagine. The ability to not only build objects but change the physics of the VR universe makes this software virtually endless in what can be created." Full Steam Review
Thanks so much, Oeozero, for the video review; you took it to the next level: https://youtu.be/99e7OvA9V9s

Steam user PublicVRLab writes: "As the lead creative designer at the Public VR Lab, I'm in a constant search for the next VR creation tool. Playing with Simmetri for the first time was a dream come true." Full Steam Review
Steam user Pinckney Benedict writes: "I'm a professor in the creative writing program of a large public university. I've long been on the hunt for a tool that would allow me and my students--almost none of us STEM types or coders--to make narrative art in VR (and in 2D). Simmetri is that tool." Full Steam Review
Steam user ALF writes: "Simmetri interactive tutorials are probably the best i've ever walked through!" Full Steam Review
In this update, we wanted to highlight some awesome user created content. You can find all of these experiences under the Featured section of the content explorer in Simmetri.
by Pinckney Benedict
This experience depicts a highly atmospheric journey. From the author:
A VR setting of the Henry Wadsworth Longfellow poem "Haunted Houses." Read by Professor Michael Humphries. Created for the Exterior/Interior design exposition at Southern Illinois University Carbondale.
This experience shows a detailed solar system simulation. From the author:
A model of the solar system meant for conveying an overview of the Sun, planets, and asteroid belts. The rotations are roughly accurate, but due to space considerations, the orbits are not accurate. They are modeled as simple circles rather than the complex ellipsoids we actually observe. Also the orbit periods are not accurate for the same reasons. But to introduce the Solar System and its relationships, this can be a good place to start.
For this update, we focused on audio by adding support for external audio sources (microphones, line-in devices), generating audio from signals and implementing better audio visualization features for connecting audio data to properties in your universe.
Sampling External Audio Sources
A big feature introduced in this update is the ability to sample external recording devices, such as a microphone or line-in input, and bring in that audio as either a Sound in the space or for visualization purposes only. To do this, there's a new audio input object called the InputAudioStream. Once created in your library, you can drag-and-drop it into the space to create a Speaker that plays the audio or you can set it as the Source of an AudioSignal (like an AudioPeakSignal). You can bind an InputAudioStream to a specific recording device via the Edit Menu/Preferences/Inputs panel.
Another new feature is the ability to generate audio via Signals. For example, you can generate tones via a WaveformSignal. To do this, a new audio input called SignalAudioStream has been added. Once you create a SignalAudioStream, you can set a Signal in its properties then drag and drop it into the space to create a Speaker that outputs the signal's generated audio. Try different signals and wavelengths along with Reverb on the Speaker for interesting results.
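Under the hood, generating audio from a signal amounts to sampling the signal at audio rate to fill a playable buffer. A minimal sketch of that idea (generic illustration, not Simmetri's implementation):

```python
import math

def waveform_signal(t, freq=440.0):
    """A simple sine 'signal': value in [-1, 1] at time t seconds."""
    return math.sin(2 * math.pi * freq * t)

def fill_buffer(signal, sample_rate=44100, n_samples=1024):
    """Sample the signal at audio rate to produce a playable buffer,
    which is essentially what a signal-driven audio stream does."""
    return [signal(i / sample_rate) for i in range(n_samples)]

buf = fill_buffer(waveform_signal)
```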
A new signal and pulse have been introduced to better sample audio for modulating parameters in the space. The AudioPeakSignal will generate a signal from the audio's peak data with a smooth drop-off. By connecting this signal to various properties in the space, you can easily pulse values to peaks in the audio. The AudioPeakPulse is similar to the signal, but can be used wherever pulses are required.
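A peak signal with smooth drop-off is classically implemented as an envelope follower: the envelope jumps to new peaks instantly and decays exponentially between them. An illustrative sketch (the decay constant is an arbitrary example):

```python
def peak_follower(samples, decay=0.95):
    """Track the peak of an audio stream with exponential drop-off.

    The envelope snaps up to any new peak and decays smoothly between
    peaks -- the classic behavior of a peak signal used to pulse
    property values in time with the audio.
    """
    envelope, out = 0.0, []
    for s in samples:
        level = abs(s)
        envelope = level if level > envelope else envelope * decay
        out.append(envelope)
    return out

# A single peak followed by silence: the envelope decays smoothly.
env = peak_follower([0.0, 1.0, 0.0, 0.0, 0.0])
```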
Another big change is the elimination of AudioStreams when importing audio data from files. Previously, if you wanted to import a long audio file (e.g. a song), Simmetri advised you to import it as an AudioStream to ensure the data would be streamed efficiently. However, doing so restricted you to the AudioStream's timeline-syncing behavior when playing the audio in the space through a Speaker, which wasn't always desirable.
Now, all audio is imported as AudioClips and how and when those AudioClips are played is determined through their Sound nodes. A Sound node now has an "Auto Play" property that can be used to set when (or if) a Sound is automatically played. One of the auto play options is "On Timeline", which emulates the prior functionality of an AudioStream. Other auto play modes introduced include "On Copy" (for node emitters that emit sounds), "On Proximity" (to play when the camera gets close to a sound) and other modes as well.
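The "On Proximity" mode boils down to a distance check between the listener and the sound. A minimal sketch of the idea (the trigger radius is a hypothetical example, not Simmetri's default):

```python
import math

def should_autoplay(sound_pos, camera_pos, radius=5.0):
    """'On Proximity' idea: start playback once the listener enters the
    trigger radius around the sound. The radius here is hypothetical."""
    dx, dy, dz = (s - c for s, c in zip(sound_pos, camera_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius
```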
Note that (unlike in prior versions) a Sound node (created by dragging and dropping an AudioClip into the space) will not play by default. An auto-play mode must be chosen, or the Sound must be played via script.
Other Additions And Bug Fixes
AudioSignals (like the AudioPeakSignal or AudioFrequencySignal) can now sample either the output mix or a particular AudioStream
AudioWaveformSignal and AudioFrequencySignal can now be sampled through space coordinates to build oscilloscopes and spectrum analyzers out of objects/nodes
Increased number of waveform samples for the AudioWaveformSignal
Sound looping property moved into AudioClip
Sound TransposePitch property now continuous
Fixed Reverb Decay Time not working
Added a new interactive tutorial for changing the GUI text size
Particle effects now inherit velocities of parent RigidBodies
A tooltip now appears when dragging a timeline event displaying its current time
Fixed bug where Add To Library was not updating the browser correctly
For this update, we’ve added a new tool for live performing: BPM (beats-per-minute) controls.
You can now add BPM controls to your show to create a panel you can access at play time to control the BPM rate dynamically. When you tap a BPM rate using the controls, you effectively set the frequency of a special BPM Pulse that lives in your Universe, which you can assign to properties and emitters to sync them to the beat. You do this by dragging and dropping the BPM Pulse onto any numerical property.
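The underlying conversion is simple: a BPM rate maps to a pulse frequency of bpm/60 Hz, and tap tempo averages the gaps between taps. A small sketch of the math (generic, not Simmetri's code):

```python
def bpm_to_pulse(bpm):
    """Convert beats-per-minute into a pulse frequency (Hz) and the
    interval between pulses (seconds)."""
    freq_hz = bpm / 60.0          # 120 BPM -> 2 pulses per second
    interval_s = 60.0 / bpm       # 120 BPM -> one pulse every 0.5 s
    return freq_hz, interval_s

def taps_to_bpm(tap_times):
    """Estimate BPM from successive tap timestamps (in seconds) by
    averaging the gaps between taps."""
    gaps = [b - a for a, b in zip(tap_times, tap_times[1:])]
    return 60.0 / (sum(gaps) / len(gaps))
```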
You can find the ‘bpm controls’ tool under ‘live tools’ in the toolbox.
When you add the controls to the space, they are created in an overlay ViewPort that is active regardless of which ViewPort you are in. Think of it as your control surface, allowing you to interact with controls that only the performer sees in the case where you create a separate Display representing your output to the audience.
Since these controls live in your universe, you are free to change and edit them as you like. Additionally, since they are added as objects to your Universe, they will be included as part of your show when exporting it as an application.
Here’s an article that describes how to create and use the bpm controls: BPM Controls
Other Additions And Bug Fixes
Fixed a bug where objects emitted on move by a rotation or scale transform widget stayed in the space on stop
Made scrubbers easier to scrub by increasing their vertical hit area
Grabbable interactions now require objects with RigidBodies
Made Grip compatible with sounds, emitters and decal emitters
Made auto mode state more clear on bpm controls
Camera monitor tool now creates monitor for existing picked camera
Since launch, we’ve been busy listening to your feedback. For this first update pass, we focused on continuing to build out support for the live artist/VJ community. Here’s a rundown of the changes:
Rendering To Outside Applications
One of the larger changes made in this update is the decoupling of inputs/outputs (think multi-monitor output, Spout sharing, etc) from your Universe into the application Preferences panel. Here’s a rundown for why this change was made:
Before, when you wanted to output an in-world camera or screen to an external application or monitor, you needed to create a Display for the camera and then, under the Display’s properties, add the appropriate Display Outputs to send the Display’s rendered texture to external displays/applications. Adding them directly to the Display in this manner ‘baked’ the outputs into your show, so they needed to exist exactly as defined even when opening the show on a different configuration or computer. Also, if you exported your show as an application, the application would require those same exact outputs to function properly, which is no good if you take your show on the road to venues with different display configurations.
To address this problem, the Display outputs have been moved into Edit/Preferences/outputs, meaning they are configured at play time (in your exported app as well) under Preferences. You still need to create Displays in the Universe, as these represent your Universe’s outputs. However, now you (or your end users) bind and set up the actual output devices under Edit/Preferences.
Like outputs, input texture bindings have also been decoupled from the Universe. Now, when you want your universe to receive shared texture data from an external app, you can create Input2DTextures (formerly Spout2DTextures) in your Universe and bind them in the ‘inputs’ tab under Edit menu/Preferences. If no input is explicitly defined, the Input2DTexture will connect to the first input it detects.
NewTek NDI Support
NDI support has now been added for both Inputs and Outputs, meaning you can now share and receive NDI-based video streams. This is especially useful when sharing rendered output over networks.
The grip tool was added for cases where you, the performer, live-manipulate a show/presentation projected to the audience while you direct things immersed in VR. To use it, click a camera (or light/sound/emitter) in the space to add a grip object, which can then be used at play time to grab and move the camera using your VR hand laser.
A new property controller has been added that allows you to alter the value of any numerical or boolean property when a pulse emits an event. You can have the value pulsed or shifted using a variety of animation curves.
Drag + Drop Automatic Pulse/Signal Controller Assignment
If you create a Pulse or Signal in your browser library, you can quickly assign it to modulate any numerical or boolean property by dragging and dropping the signal or pulse from the browser onto the property. After the drag-and-drop, a new set of options will be listed under the property to control how the value is changed.
Located under the ‘ui \ game’ tools of the toolbox, these allow you to quickly add a spin or slide animation to a selected object. These are cumulative as well, so you can add a bunch to an object for interesting results. These tools effectively automate adding signals to an object's transform. If the object has a movable rigid body, these tools will create motorized physical constraints instead so the spinning/sliding object will correctly interact with other movable physical objects (i.e. with RigidBodies).
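Because these animations are cumulative, each added spin contributes its own angular velocity, and the object's rotation is their sum over time. A tiny sketch of that idea (the rates are arbitrary examples):

```python
def spin_angle(t, spin_rates=(90.0, 30.0)):
    """Cumulative spin: each added spin contributes its own angular
    velocity (degrees/second); the object's rotation at time t is
    their sum, wrapped to [0, 360). Rates here are arbitrary examples."""
    return sum(rate * t for rate in spin_rates) % 360.0

# Two stacked spins (90 + 30 deg/s) give 120 degrees after one second.
angle = spin_angle(1.0)
```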
Other Additions And Bug Fixes
Simmetri no longer closes if you shut SteamVR down while in VR mode
Fixed issues preventing Simmetri from rendering through the GameBar to Mixer for video and broadcasting
Fixed flickering issues that sometimes could occur when using multiple screen display outputs
All Displays now have a built in DisplayTexture for quickly accessing the Display’s texture to apply to in-world materials.
Screen aspect ratio is now defined under the DisplayTexture’s properties
All texture thumbnails can now be dragged and dropped into the space to create a plane rendering the texture
Dome output now will always output as square aspect ratio without needing to explicitly set it
The script editor window will now hide itself when the Steam Overlay is shown
Ctrl+Tab now out-dents selected code (instead of Shift-Tab, which conflicted before with the default Steam Overlay shortcut)
Screen preferences (under the Preferences panel) have now been folded into the ‘graphics’ and ‘outputs’ preferences tabs
Fixed edit boxes altering the value (in some cases) as you typed in them
Can now directly attach attributes to transform/vector components without having to add a splitter attribute
Fixed a bug where changing a default binding of a PlayerController would reset on stop
Added a menu item and keyboard shortcut (Ctrl+L) to lock/unlock selected objects
Can now add PlayerControllers directly to Spaces for defining global controls
Fixed issue where dragging and dropping a signal into the space was not setting itself as the rendering signal on the created SignalVisualizer
Input2DTexture thumbnails in the browser now update in real-time. Before they would only update when the browser panel was refreshed.
Now each ViewPort will have its own separately active PlayerNode if actively rendered through a display.
First Early Release Update: Ask and You Shall Receive.
We've been hard at work to deliver a whole pack of tutorials, improvements and features you've all requested. Tutorials and news below.
Here's what to look forward to in an upcoming Simmetri update (June 11):
Immersive VR-VJ tools
Deep organizational improvement for your display inputs / outputs.
Beat controls (live BPM controls) and a system to attach pulses to any property.
NewTek NDI Input and output support (for network texture sharing)
Pre-loaded drag&drop animations.
Direct Record and Livestream with Windows Gamebar.
Creator Spotlight: Jesse James >> Simmetri + HeavyM Projection Mapping
Brilliantly elegant first piece from artist Jesse James, combining Simmetri with HeavyM for projection mapping and realtime animation. We'll have to ask him to teach us what he did. Excited to see what he does next. Check out Jesse's piece: YouTube video here.
These shows are set up to demo what's possible when you combine skills from the new tutorials in the above list: onMove, Emit when moved, emit (unit+timeline) Animation, Signal animation. These were created in 10 minutes and yet show a lot of potential. Let us know how you think you could take this effect to the next level in the Creative Ideas discussion forum.
Different from "How Do I?", "Bug Reports" and "Feature Requests": if you're encountering technical difficulties with Simmetri or the ecosystem of plugins, inputs, outputs, hardware, etc., let us know. We aim to be extremely responsive to activity in every forum, so please post; we'll try to get your issue solved and get you back to being creative. Technical Support Forum Link Here
As you post in "Technical Support":
Provide as much detail as possible about your computer, OS, additional software (in the case of integrations) and hardware specs. Everything but your Steam password ;)
Detail the steps you take, using exact language from the Simmetri UI if possible.
If you encounter issues importing 3D models, we want to know. We may ask you for a download link to the model. 3D model formats are extremely unstandardized, and while we cover a good majority pretty well, there are always weird ones out there. So let us know which one you're working with, and we'll help you get your particular model into Simmetri.