Far North Entertainment is working on a third-person, zombie-horde shooter using DOTS, which has delivered significant performance improvements. Learn how DOTS processes data at high speed in two areas of the game. We also provide a concrete example of how you can implement DOTS as part of a traditional Unity application.
Simon Eliasson, Co-owner & Game Director, Far North Entertainment
Simon was introduced to programming at the age of 13 and has been coding ever since. After completing his studies in computer science engineering and interaction design, he joined Far North Entertainment as its game director. Simon takes an interest in nearly all aspects of game development. Apart from game design and programming, he works with production planning, VFX, music composition, sound design, and the company's YouTube channel.
A single, unified UI editing tool in Unity is our goal for UIElements. Learn about UIElements for runtime, new UI authoring workflows, and how we’re building the tools to benefit artists and creators.
Learn what’s involved in migrating existing game code to the Data-Oriented Technology Stack (DOTS), which comprises the C# Job System, the Entity Component System, and the Burst Compiler.
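To give a flavor of the kind of code such a migration moves toward, here is a minimal sketch of a Burst-compiled parallel job using the C# Job System; the job name and fields are illustrative assumptions, not code from the session.

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

// Hypothetical example: integrate a batch of positions in parallel with a Burst-compiled job.
[BurstCompile]
public struct MoveJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> Velocities;
    public NativeArray<float3> Positions;
    public float DeltaTime;

    public void Execute(int index)
    {
        Positions[index] += Velocities[index] * DeltaTime;
    }
}

// Scheduling from existing MonoBehaviour code (assumes the NativeArrays are already allocated):
// var handle = new MoveJob { Positions = positions, Velocities = velocities, DeltaTime = Time.deltaTime }
//                  .Schedule(positions.Length, 64);
// handle.Complete();
```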
Introduction to the DOTS Sample and the NetCode that drives it
Join us for a deep dive into the networked future of Unity using DOTS. We share how we made the DOTS Sample a networked game, and what we learned along the way.
The Animation Rigging package provides a library of rig constraints for you to procedurally control skeletal animation at runtime. Learn how to set up rigs for animation authoring and more.
Bringing 2D characters to life with sprite rigging
When animating characters and elements in your game, it’s important to know the benefits and drawbacks for each technique. In this session, learn what’s new with Unity 2D’s rigging.
Join us for an overview of the new 2D graphics features, including 2D Lights, 2D Shader Graph, and 2D Shadows, learn how to improve a 2D level using these tools, and see how they all work together.
This session covers the benefits of asynchronous routines over coroutines. You’ll see how one sample problem – building an asynchronous prompt pop-up – can be solved using async vs coroutines.
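For context, an awaitable prompt pop-up might look roughly like the sketch below; the component and field names are hypothetical and not the session's actual sample code.

```csharp
using System.Threading.Tasks;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical prompt pop-up that callers can await instead of driving with a coroutine.
public class AsyncPrompt : MonoBehaviour
{
    [SerializeField] private Button confirmButton;  // assumed to be assigned in the Inspector
    [SerializeField] private Button cancelButton;

    public Task<bool> ShowAsync()
    {
        var completion = new TaskCompletionSource<bool>();
        confirmButton.onClick.AddListener(() => { gameObject.SetActive(false); completion.TrySetResult(true); });
        cancelButton.onClick.AddListener(() => { gameObject.SetActive(false); completion.TrySetResult(false); });
        gameObject.SetActive(true);
        return completion.Task;
    }
}

// Caller side: bool accepted = await prompt.ShowAsync();
```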
Get a high-level overview of the Entity Component System (ECS) and turn-based game loops. This session shows ECS concepts in a slightly exotic context, and also shares some use case pitfalls.
Optimizing and deploying real-time ray traced GI with RTXGI
We review ray tracing, show behind-the-scenes VFX breakdowns for a Unity demo, and share best practices for lighting and modeling when working with NVIDIA RTXGI and accelerated ray tracing.
Learn how the Demo team used Unity features like the High Definition Render Pipeline, Post Processing, Shader Graph, Visual Effect Graph, and Timeline to create the real-time short film, The Heretic.
Discover best practices for leveraging the Addressable Asset System to simplify your content management, and how Addressables can support your game through its initial release and subsequent updates.
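As a small illustration of the workflow this session builds on, an asset marked as Addressable can be loaded by its address at runtime roughly as follows; the address string and class name are placeholders.

```csharp
using UnityEngine;
using UnityEngine.AddressableAssets;
using UnityEngine.ResourceManagement.AsyncOperations;

// Hypothetical example: load an Addressable prefab by address and spawn it.
public class AddressableSpawner : MonoBehaviour
{
    [SerializeField] private string address = "Characters/Zombie"; // placeholder address

    private AsyncOperationHandle<GameObject> handle;

    private async void Start()
    {
        handle = Addressables.LoadAssetAsync<GameObject>(address);
        GameObject prefab = await handle.Task;
        Instantiate(prefab, transform.position, Quaternion.identity);
    }

    private void OnDestroy()
    {
        // Release the handle so the loaded asset can be unloaded.
        if (handle.IsValid())
            Addressables.Release(handle);
    }
}
```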
The High Definition Render Pipeline (HDRP) helps both indie and AAA studios produce high-quality visuals for PC and console games. Learn how and when to start using it for production.
We provide an overview of the physics systems and workflows powering DOTS. Get insight into design considerations underlying Unity Physics and how its use cases differ from those of Havok Physics.
Quickly generate game character behaviors using AI and ML
Join us for a walk-through of the Unity Behavior Planner, Unity ML-Agents Toolkit, and Unity Inference Engine – three tools that leverage state-of-the-art AI algorithms to help with your next game.