🌟 Sneak Peek Friday: Recap of Our Team Event and Future Developments! 🚀

This week has been nothing short of incredible for the Persival team. After welcoming our latest team member on board, we want to use today's Sneak Peek Fridays series to recap our fantastic team event and share our plans for the future.

Our team event was a perfect blend of collaboration, fun, and technical innovation:

🤝 Shaping Team Values and Collaboration: We took time to further shape our team values and enhance our collaboration.

🎉 Socializing Event and Dinner: We had a lot of fun at our socializing event and dinner, enjoying great food, laughter, and camaraderie.

🔍 Technical Deep Dive: On the second day, we held an intense and productive technical deep dive session. We discussed how we want to further develop material handling, surface normals of the geometry, and OpenMATERIAL topics for our lidar and radar sensor simulation. These discussions are crucial for the continued advancement of our cutting-edge Sensor Model Development Library (SMDL).

Our commitment to innovation and excellence remains steadfast as we push the boundaries of perception sensor simulation. We are excited about the future and the impactful developments ahead. 🚀

Stay tuned for more insights and updates every Friday as we continue to lead in sensor simulation and model validation!

#SneakPeekFridays #TeamEvent #Collaboration #Innovation #PersivalGmbH #Perception #Sensor #Simulation #Model #Validation #Lidar #Radar #OpenMATERIAL #SMDL
Persival GmbH’s Post
More Relevant Posts
It can't be emphasized enough: standardization initiatives such as ASAM OSI and ASAM OpenMATERIAL are key to making perception sensor simulation the development and testing tool it needs to be in the near future, and we are currently seeing massive breakthroughs 🚀. We already need automated vehicles for public transport, logistics, and agriculture, among other uses, and as Egon Wiedekind always says: The Game Changer Simulation will get us there!
🚗 Sneak Peek Friday: Wheel-Turning Micro-Doppler Magic! 🌐🔄

In this edition of Sneak Peek Friday, we follow up on the micro-Doppler effect simulation as we unveil the dynamic simulation of wheel turning in our co-simulation. The intricate details of wheel rotation, combined with the micro-Doppler effect calculated via ray tracing, bring a new level of authenticity to our perception sensor simulation. This advancement is made possible by utilizing standardized interfaces like the Open Simulation Interface (OSI) and OpenMATERIAL by ASAM e.V. - Association for Standardization of Automation and Measuring Systems.

🌟 Why It Matters: This breakthrough extends our commitment to realism, capturing the subtle dynamics of vehicle components for perception sensor simulation, especially for credible FMCW lidar and radar sensor modeling.

📽️ See the Micro-Doppler Magic Unfold! Curious to witness the magic? Check out the video below, where we explore the intricacies of our simulations, unraveling the captivating dance of wheels and the micro-Doppler effect. In the video, wheels turn dynamically while the brakes remain stationary. It also showcases how ASAM OSI in combination with ASAM OpenMATERIAL enables standardized movement of wheels and brakes.

🌐 Advancing Realism in Sensor Simulation: Stay tuned for more insights, innovations, and sneak peeks into the world of simulation every Friday. We're shaping the future of perception sensor simulation, one revolution at a time!

#SneakPeekFridays #SensorSimulation #MicroDopplerEffect #WheelTurningSimulation #Perception #Sensor #Simulation #Model #Validation
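To give a feel for the physics behind the effect, here is a minimal sketch (not Persival's SMDL code) of micro-Doppler on a rolling wheel: every point on the tyre combines the vehicle's translation with the wheel's rotation, so different points return different Doppler shifts. The coordinate frame (x forward, y up, origin at the axle), the head-on sensor geometry, and the ~77 GHz wavelength are all illustrative assumptions.

```python
RADAR_WAVELENGTH_M = 0.0039  # hypothetical ~77 GHz automotive radar wavelength

def wheel_point_velocity(v_vehicle, wheel_radius, x, y):
    """Velocity (vx, vy) of a point at offset (x, y) from the axle of a
    wheel rolling forward at v_vehicle without slip (x forward, y up)."""
    omega = -v_vehicle / wheel_radius  # clockwise spin when rolling forward
    vx = v_vehicle - omega * y         # translation + x-component of (omega x r)
    vy = omega * x                     # y-component of (omega x r)
    return vx, vy

def micro_doppler_hz(v_vehicle, wheel_radius, x, y, wavelength_m=RADAR_WAVELENGTH_M):
    """Two-way Doppler shift f_d = 2 * v_r / lambda, for a sensor far ahead
    of the wheel (line of sight approximately along +x)."""
    vx, _ = wheel_point_velocity(v_vehicle, wheel_radius, x, y)
    return 2.0 * vx / wavelength_m
```

Under these assumptions, the contact patch (y = -R) is momentarily stationary and returns no Doppler shift, while the top of the tyre (y = +R) moves at twice the vehicle speed and returns twice the body's Doppler shift; that spread is exactly the micro-Doppler signature a ray tracer can capture by evaluating hit-point velocities.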
Exciting Announcement from the ISPRS Geospatial Week #gsw2023 Conference!

I'm thrilled to introduce Mono-Hydra, our innovative extension to the Hydra framework. Venturing into the relatively uncharted domain of monocular camera setups for real-time spatial perception systems, Mono-Hydra bridges the gap between compact agility and real-time 3D scene graph generation.

🔍 Highlights:
1️⃣ Sub-20 cm error in real time at 15 fps on an NVIDIA 3080 GPU with a monocular camera setup.
2️⃣ Incorporates deep learning algorithms for depth and semantics prediction and a classical VIO algorithm, RVIO2.

🔗 Dive into our paper: https://lnkd.in/dxGf8aXj

A huge shoutout to Prof. Francesco Nex and Prof. George Vosselman for their invaluable guidance. Gratitude to our team and the #GSW2023 community – let's keep pushing boundaries! 🚀 https://lnkd.in/d3tgPwyd

#MonoHydra #uavcentre #UT #ITC #Robotics #SpatialPerception #Hydra #RVIO2 #DeepLearning
🌟 Sneak Peek Friday: Development in SMDL and Simspector! 🚗🌲

In this week's episode of Sneak Peek Fridays, we're excited to highlight a significant advancement in our Sensor Model Development Library (SMDL) and our data analysis and visualization tool, Simspector: both can now work with alpha textures and normal maps! This enhancement is a game-changer for our simulation capabilities, providing both better performance and higher fidelity.

Three months ago, we shared a video analyzing the 3D environment and moving objects for geometry (meshes/primitives) and material assignments. Now, with the integration of alpha textures and normal maps, we've taken a huge step forward in simulation detail. This improvement allows us to generate the lidar point cloud more accurately in complex environments, such as vegetation alongside the road.

Watch the video below to see these enhancements in action. It shows the same exemplary scene we created in-house, depicting a German highway, but now with significantly more detail in the vegetation. You'll notice a marked difference in the realism and precision of our simulations.

We are thrilled with this development and look forward to continuing to push the boundaries of sensor simulation technology. Stay tuned for more updates and innovations from Persival!

#SneakPeekFridays #SMDL #Simspector #AlphaTextures #NormalMaps #Perception #Sensor #Simulation #Model #Validation
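For readers curious how alpha textures help lidar simulation of vegetation, here is a minimal sketch (not the actual SMDL implementation): the alpha channel gates whether a ray hit counts as a surface return, so gaps between leaves let rays pass through instead of producing false points. The texture layout, nearest-neighbour sampling, and cutoff value are illustrative assumptions.

```python
# Alpha-cutout test for a single lidar ray hit. The texture is a grid of
# alpha values in [0, 1]; values below the cutoff are treated as transparent
# (e.g. the gaps between leaves in a vegetation texture).

ALPHA_CUTOFF = 0.5  # hypothetical threshold

def sample_alpha(texture, u, v):
    """Nearest-neighbour lookup into an alpha texture (list of rows of
    floats in [0, 1]) at texture coordinates (u, v) in [0, 1]."""
    h, w = len(texture), len(texture[0])
    col = min(int(u * w), w - 1)
    row = min(int(v * h), h - 1)
    return texture[row][col]

def ray_hit_is_solid(texture, u, v):
    """True if the ray hits opaque material at (u, v); False means the
    ray continues through the cutout and can hit geometry behind it."""
    return sample_alpha(texture, u, v) >= ALPHA_CUTOFF
```

With this kind of test, a single quad with a leaf texture yields a realistic partial return pattern instead of a solid wall of points, which is what makes roadside vegetation in the point cloud look right.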
What do virtual reality and augmented reality smart glasses, automobile lidar, cell phone camera lenses, night vision, terrain mapping, and facial recognition have in common? All are technologies dramatically improved by metalenses: extremely thin optical structures that combine multiple functions of traditional, bulky curved optics into an ultracompact package.

Now, development and adoption of these technologies is poised to accelerate substantially in Massachusetts, thanks to a $5 million award from the Massachusetts Technology Collaborative (MassTech), announced earlier this month. The grant will enable the University of Massachusetts Amherst to establish an open-access Advanced Optics Fabrication and Characterization Facility on its campus, a unique resource available to industry partners and researchers across the state.

The MassTech grant also funds a major expansion of industry partner Electro Magnetic Applications, Inc.'s test and characterization capabilities to evaluate these new designs in real-world and harsh environments.

Learn more: https://bit.ly/47a9FtF
Another example of why we use NavVis as one of our foundational tools for Digital Twins. Reach out if you're interested in learning more about how Digital Twins and BIM can benefit your next project.
We're honored to be mentioned in the recent NVIDIA blog post "Accelerating Data Center Design With Digital Twins", which showcases how #rapidinnovation is made possible using digital twin technologies.

The #realitycapture process starts with mobile lidar scanning using the revolutionary NavVis VLX 3 system. From there, Prevu3D was used to create an accurate 3D model in NVIDIA Omniverse. This enabled the virtual removal of the existing hardware and the rapid design, optimization, and installation of the new compute and network infrastructure, finding and fixing potential issues before they even occur!

Come see us at booth 531 in the Industrial Digitalization Pavilion at #GTC24 to meet our service delivery partners NavVis and Prevu3D, and see these exciting technologies in action!

#nvidiagtc #digitaltwin #digitaltwins #ai #artificialintelligence #syntheticdata #scanning #laserscanning #slam #lidar #lidartechnology #industrial #industrialengineering #industrialai #realitycapture #computervision #nvidiaomniverse
Accelerating Data Center Design With Digital Twins
#Mevea’s Senior Technical Advisor, Dr. Asko Rouvinen, presented at the 2nd Autonomous Off-Highway Machinery Technology Summit (Feb 21-22, 2024) how advanced simulation, i.e. a physics-based #DigitalTwin, can be used in assistive and autonomous system development and what benefits can be achieved. 👏😊

Two concrete examples presented were SLAM (Simultaneous Localization and Mapping) application development and real-time simulation of structural flexibility. In the video, a wheel loader with two LiDARs and an odometer sensor is running in a Unity visualization. The sensor signals use standard ROS (Robot Operating System) messages, and the LiDAR point cloud visualization, including deformable soil, is shown in the ROS 3D visualizer RViz. 💪

The link to the full article is in the comments. 👍

#automation #OEM #AOMT2024 #construction
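The "standard ROS messages" mentioned above are what let a simulator like this feed RViz directly. As a hedged, stdlib-only sketch (no ROS installation required), here is the byte layout behind a lidar cloud published as a ROS sensor_msgs/PointCloud2: each point is three float32 fields (x, y, z), so point_step is 12 bytes and the data blob is the points packed back to back. Little-endian packing and the xyz-only schema are simplifying assumptions; real clouds often carry extra fields such as intensity or ring.

```python
import struct

POINT_STEP = 12  # 3 x float32 per point (x, y, z)

def pack_cloud(points):
    """Pack (x, y, z) tuples into a PointCloud2-style data blob."""
    return b"".join(struct.pack("<fff", x, y, z) for x, y, z in points)

def unpack_cloud(data):
    """Recover the (x, y, z) tuples from a packed blob."""
    return [struct.unpack_from("<fff", data, off)
            for off in range(0, len(data), POINT_STEP)]
```

Because the layout is standardized, any ROS tool that understands PointCloud2 (RViz included) can consume the simulated LiDAR stream exactly as it would a real sensor's.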
NVIDIA Ambassador || Building Nex-Dynamics || NVIDIA Isaac ROS || ROS Developer || Diploma in Mechanical Engineering || B.Tech Robotics & Automation (UG) || Student at Karunya Institute of Technology and Sciences
https://lnkd.in/gpJYMGRq

Exploring Omniverse Isaac ROS simulation using a URDF robot model. In this tutorial you will learn how to:
1.) Import a URDF model into the Omniverse environment.
2.) Add a lidar sensor and a camera.
3.) Bridge to ROS and visualize the data in ROS.
4.) Interface Isaac ROS with OpenCV.

#nvidiaomniverse #nvidia
Exploring Omniverse ISAAC ROS Simulation with URDF, Sensors, and OpenCV Integration
https://www.youtube.com/