Applied Technology Review : News

Advancements in XR technologies such as VR, AR, and BCIs are key drivers shaping the human experience within the metaverse. The term "metaverse" has gained wide currency in recent years, prompting debate over its definition, its reality, and who might ultimately own it on a global scale. Despite all the noise around it, there is still no consensus on what the metaverse actually is. Rather than getting bogged down in the details of how the metaverse would function, a more practical approach is to envision how it might look and feel, and to anticipate the socioeconomic changes it could bring. If technologists are right that 2022 will separate thinkers from builders, the advances made across many technical fields in previous years will pave the way for the first steps in creating the metaverse. Its implementation will be aided by improvements in graphics processing units (GPUs), photorealistic 3D engines, volumetric video, and AI that enable faster content generation, as well as by the broad adoption of cloud computing and 5G and the development of blockchain infrastructure. Among all these technological developments, extended reality (XR) technologies have the greatest influence on how people will interact with the metaverse. As the upcoming computing platforms, XR encompasses virtual reality (VR), augmented reality (AR), and brain-computer interfaces (BCIs). Thanks to XR's rapid growth, shipments of VR and AR headsets are predicted to surpass those of game consoles worldwide by 2024. These technologies can transform digital experiences and serve as the general public's main access points to the metaverse. Rivalry among XR technologies is strong, with businesses investing heavily to control the industry.
Some contend that virtual reality (VR) can replace only certain aspects of human experience, and they caution against concentrating power in the hands of a few VR device and content providers, which could result in "walled gardens" like those of the internet as it exists today. Web3 proponents, on the other hand, foresee a metaverse that subverts the dominance of tech giants and offers a decentralized internet experience, with control and monetization shifted to users and content producers. This metaverse concept aims for a more inclusive and open digital environment than the closed ecosystems developed by major tech companies. BCIs play a larger and more futuristic role in the metaverse: they are intended to completely replace the physical interfaces, displays and conventional input devices, on which current XR technologies rely. Innovative technologies such as neural implants, which must be placed in the brain via neurosurgery, present fascinating possibilities but also provoke reluctance and anxiety among potential users.
Cambridge, UK – TEKEVER is a global leader in unmanned systems technology; CRFS is a pioneer in building ultra-sensitive RF receivers for spectrum monitoring and geolocation. Together, the companies have successfully completed phase one of their system integration partnership and launched the first sub-tactical unmanned aerial system (UAS) carrying highly sensitive RF sensors as a payload. The TEKEVER AR5 has an endurance of 20 hours, a payload capacity of 50 kg, and a cruise speed of 100 km/h. The RFeye Node is a lightweight, rugged RF receiver with 100 MHz of instantaneous bandwidth (IBW) and a frequency range of up to 40 GHz. Integrating an RFeye Node into an AR5 allows teams to geolocate ground-based targets beyond the horizon, vastly increasing ISR capabilities. Capable of taking off from short, unpaved airstrips, the wide-area-surveillance AR5 is easily deployed. When integrated with ruggedized, IP67-rated RF sensors optimized for size, weight, and power (SWaP), the UAS offers unparalleled spectrum monitoring, detection, signal capture, and geolocation (TDoA) capabilities. The partnership between TEKEVER and CRFS gives end-users an asset covering vast land or sea areas, with many potential applications: maritime surveillance, search and rescue, border monitoring, military ISR, and even regulatory spectrum monitoring. Thanks to the altitude at which the drone operates, the increased signal collection radius yields unprecedented gains in operational range, enabling new concepts of operation. For advanced capabilities, combining the integrated UAS with existing ground-based units allows users to create an adaptable multidomain network of receivers for superior passive ISR over huge areas. This is particularly important in active combat zones, since greater altitude allows signals to be detected at greater distances, further from the front line.
Dr Pio Szyjanowicz, COO of CRFS, said: “To make this happen, our engineering teams have combined their ingenuity and agility to overcome the technical challenges that are inevitable when integrating high-performance electronics systems on an airframe. One of the most significant was that UAS have a significant number of transmitters onboard that have the potential to interfere with the highly sensitive RFeye receiver payload. Achieving the optimal solution in terms of antenna position and RF filtering is just one example of the excellent teamwork between TEKEVER and CRFS.” Tiago Nunes, Product Director at TEKEVER, said: “The groundbreaking partnership between TEKEVER and CRFS is a testament to the power of collaboration. It’s a game-changer, offering end-users an incredibly versatile asset that can cover vast land and sea areas. The possibilities are limitless, from maritime surveillance, search and rescue, border monitoring, to military ISR, and even regulatory spectrum monitoring. This partnership showcases the remarkable synergy between the two companies, exemplifying their dedication to pushing the boundaries of innovation. It is a perfect union of expertise and technology, resulting in a solution that exceeds all expectations.”
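The article does not disclose how the RFeye system computes TDoA fixes, but the principle behind time-difference-of-arrival geolocation is straightforward: each pair of receivers constrains the emitter to a hyperbola, and the emitter lies where those constraints agree. The sketch below is a minimal, illustrative Python example, not the vendors' algorithm: it simulates ideal TDoA measurements from three receivers and recovers the emitter position with a coarse grid search (real systems use far more sophisticated solvers and must handle timing noise).

```python
import math

C = 299_792_458.0  # propagation speed (speed of light), m/s

def predicted_tdoas(point, receivers):
    """TDoAs (seconds) at each receiver relative to receiver 0 for a candidate point."""
    d = [math.dist(point, r) for r in receivers]
    return [(di - d[0]) / C for di in d[1:]]

def locate_grid(receivers, measured, bounds, step):
    """Coarse grid search for the emitter position minimising squared TDoA error."""
    (xmin, xmax), (ymin, ymax) = bounds
    best, best_err = None, float("inf")
    x = xmin
    while x <= xmax:
        y = ymin
        while y <= ymax:
            pred = predicted_tdoas((x, y), receivers)
            err = sum((p - m) ** 2 for p, m in zip(pred, measured))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

# Three receivers (e.g. one airborne node plus two ground units), positions in metres
receivers = [(0.0, 0.0), (10_000.0, 0.0), (0.0, 10_000.0)]
true_emitter = (6_000.0, 4_000.0)

# Simulate the TDoAs an ideal, noise-free system would measure (receiver 0 is the clock reference)
measured = predicted_tdoas(true_emitter, receivers)

estimate = locate_grid(receivers, measured, ((0, 10_000), (0, 10_000)), step=100.0)
print(estimate)  # lands on the true position, since it coincides with a grid point
```

Adding a higher-altitude receiver, as the AR5 integration does, mainly extends how far away emitters can be detected at all; the geometry of the solver stays the same.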
Fitting acoustic sensors to modern vehicles can further minimise risks and ensure safety through their advanced functions. Modern vehicles are equipped with an abundance of sensors: multiple ultrasonic range sensors for parking and object detection, visual cameras for spotting traffic and pedestrians, LiDAR for precisely measuring the distance to nearby vehicles for adaptive cruise control, and in some cases RADAR for detecting traffic at a distance. Because multiple sensing technologies allow vehicles to operate in all types of weather, all drivers and pedestrians will experience much safer driving.

Researchers Developing Acoustic Sensors for Vehicles

Recognising the importance of acoustic data in vehicles, researchers have begun to develop both acoustic sensors for vehicles and an AI-based system that can improve acoustic event recognition. Although little is known about the sensors themselves (they probably build on existing technologies such as electret microphones), the news release explained how useful audio data can be in automobiles. Environmental sounds such as wet roads or ambulance sirens can give drivers important information, and intelligent acoustic sensor systems are valuable because conventional car sensors frequently fail to pick up these sounds. These systems aim to give automobiles a sense of hearing, providing pertinent information for driving manoeuvres or preventive maintenance, and they work alongside other driver-assistance systems. The acoustic turn assistant is one example of how acoustic sensors could be used in cars: it listens for potential sound sources in blind zones, such as other vehicles and pedestrians. Acoustic sensors can also use engine noise to identify objects, because reflected sound can amplify a car's perceived loudness.
The press statement cites acoustic rear-view "cameras" as another example of how acoustic sensors may be useful in automobiles. Simply put, microphones installed at the back of the vehicle can improve situational awareness without drivers having to open windows to hear whether pedestrians are nearby; it would be possible, for instance, to hear someone shouting at a car to halt.

Sensors Changing Vehicles for the Future

Adding acoustic sensors to automobiles could bring significant advantages. One prospective future feature is an early warning system for emergency vehicles, which activates a chime or dashboard light when a distant emergency vehicle is heard. This is particularly crucial for drivers dealing with noisy kids, loud music, or hazardous road conditions. In the future, there may be a growing need for vehicles to employ such sensors to enhance external sound perception. Automakers are continually striving to reduce cabin noise to enhance the overall driving experience; however, this effort to minimise internal noise could also diminish the sounds drivers need to hear. Hence, integrating external acoustic sensors with AI has the potential to amplify crucial external sounds while filtering out engine and traffic-related noise.
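The researchers' recognition system is not described, but one building block of acoustic event detection can be sketched simply: measuring how much signal energy falls in a frequency band where sirens typically live, versus low-frequency engine rumble. The Python example below is an illustrative toy, assuming a 600–1500 Hz "siren band" (an assumption, not a published specification), using the Goertzel algorithm to probe individual frequencies; a production system would use learned classifiers over richer features.

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Signal power at a single probe frequency via the Goertzel algorithm."""
    k = round(len(samples) * freq / sample_rate)  # nearest DFT bin
    w = 2.0 * math.pi * k / len(samples)
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def siren_band_energy(samples, sample_rate, lo=600.0, hi=1500.0, step=100.0):
    """Total power across probe frequencies in an assumed siren band."""
    total, f = 0.0, lo
    while f <= hi:
        total += goertzel_power(samples, sample_rate, f)
        f += step
    return total

SR = 16_000   # sample rate, Hz
N = 1_600     # 100 ms analysis window

# Synthetic test signals: a 900 Hz siren-like tone and an 80 Hz engine-like hum
siren = [math.sin(2 * math.pi * 900 * n / SR) for n in range(N)]
rumble = [0.3 * math.sin(2 * math.pi * 80 * n / SR) for n in range(N)]

print(siren_band_energy(siren, SR) > siren_band_energy(rumble, SR))  # True
```

A dashboard alert like the one the article envisions could then be as simple as comparing this band energy against a calibrated threshold over successive 100 ms windows.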
Copyright © 2026 Applied Technology Review. All Rights Reserved