The move is driven by physics and economics. Even when data travels at the speed of light, the time it takes to send packets halfway around the world to one central location is noticeable to users, whose minds start to wander after just a few milliseconds.
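To put rough numbers on that physics, here is a back-of-envelope sketch (the speed and distances are illustrative assumptions, and real latency also includes routing, queuing, and processing delays):

```typescript
// Rough round-trip time for a packet over fiber, ignoring everything but distance.
// Light in fiber travels at roughly two-thirds of c, i.e. about 200 km per millisecond.
const FIBER_KM_PER_MS = 200;

function roundTripMs(distanceKm: number): number {
  return (2 * distanceKm) / FIBER_KM_PER_MS;
}

console.log(roundTripMs(100));    // nearby edge node: ~1 ms
console.log(roundTripMs(20_000)); // halfway around the world: ~200 ms
```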
However, edge computing will continue to be limited by countervailing forces that may, in some cases, be stronger. Datacenter operators can negotiate lower prices for electricity, and that typically means locating right next to the point of generation, such as a few miles from a large hydroelectric dam. Keeping data synchronized across multiple locations can be a challenge, and some algorithms, like those used in machine learning, also depend heavily on working with large, central collections of data.
Despite these challenges, many architects continue to embrace the opportunity, thanks to the efforts of
cloud companies to simplify the process.
The ultimate edge location, though, will continue to be the phones and laptops themselves. Web app developers continue to leverage the power of browser-based storage while exploring more efficient ways to distribute software.
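As one small illustration of what browser-based storage looks like in practice, here is a minimal sketch using the standard Web Storage API (the key name and data shape are invented for the example; larger or structured data would more likely go into IndexedDB or the Cache API):

```typescript
// Keep data at the "ultimate edge": the user's own browser.
// localStorage persists across page loads with no round trip to a server.
interface Draft {
  text: string;
  savedAt: number;
}

function saveDraftLocally(draft: Draft): void {
  localStorage.setItem("draft", JSON.stringify(draft));
}

function loadDraftLocally(): Draft | null {
  const raw = localStorage.getItem("draft");
  return raw ? (JSON.parse(raw) as Draft) : null;
}

saveDraftLocally({ text: "hello from the edge", savedAt: Date.now() });
console.log(loadDraftLocally());
```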
>> Read more.

[2] Recent advances in quantum computing show progress, but not enough to live up to years of hyperbole. An emerging view suggests the much-publicized quest for more qubits and quantum supremacy may be overshadowed by a more sensible quest to make practical use of the qubits we have now.
However, D-Wave's annealing qubits don't have the general-purpose qualities of competing gate-model systems, and the degree of processing speed-up they provide has been questioned. Critics have also discounted D-Wave's headline qubit counts because they apply to a purpose-built approach aimed at a certain class of optimization problems.
Still, the company has a leg up on most competitors in experience, having fabricated and programmed superconducting parts since at least 2011.
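For context on that "certain class of optimization problems": annealers are generally aimed at quadratic unconstrained binary optimization (QUBO), where every variable is a 0-or-1 bit and the cost is a quadratic function of those bits. The brute-force toy below only shows the shape of such a problem; the coefficients are invented, and it has nothing to do with D-Wave's actual hardware or SDK:

```typescript
// Toy QUBO: minimize f(x) = sum over i <= j of Q[i][j] * x[i] * x[j], with each x[i] in {0, 1}.
// Upper-triangular coefficient matrix for three binary variables (arbitrary values).
const Q = [
  [-1,  2,  0],
  [ 0, -1,  2],
  [ 0,  0, -1],
];

function quboEnergy(x: number[]): number {
  let energy = 0;
  for (let i = 0; i < x.length; i++) {
    for (let j = i; j < x.length; j++) {
      energy += Q[i][j] * x[i] * x[j];
    }
  }
  return energy;
}

// Brute force is fine for 3 variables; an annealer is pitched at problems far too big for this.
let best = { bits: [0, 0, 0], energy: Infinity };
for (let mask = 0; mask < (1 << Q.length); mask++) {
  const bits = Array.from({ length: Q.length }, (_, i) => (mask >> i) & 1);
  const energy = quboEnergy(bits);
  if (energy < best.energy) best = { bits, energy };
}
console.log(best); // { bits: [1, 0, 1], energy: -2 }
```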
The gate-model quantum computing crew's benchmarks have come under attack, too, and its struggles with scaling and quantum error (or "noise") correction have spawned the term "noisy intermediate-scale quantum" (NISQ) to describe the present era, in which users have to do what they can with whatever working qubits they have.
While it will continue to work on its annealing-specific machines, D-Wave has also joined the gate-model competition, where there appears to be plenty of room for growth.
>> Read more.

[3] My first experience in a virtual world came in 1991, when I was a PhD student working in a virtual reality lab at NASA. I was using a variety of early VR systems to model interocular distance (i.e., the distance between your eyes) and optimize depth perception in software. Despite being a true believer in the potential of virtual reality, I found the experience somewhat miserable.
Even when I used early 3D glasses (i.e., shuttering glasses for viewing 3D on flat monitors), the
sense of confinement didn't go away. I still had to keep my gaze forward, as if wearing blinders to the real world. There was nothing I wanted more than to take the blinders off and allow the power of virtual reality to be splattered across my real physical surroundings.
Cut to 30 years later, and the phrase "metaverse" has suddenly become all the rage. At the same time, the hardware for virtual reality is significantly cheaper, smaller, and lighter, and has much higher fidelity. And yet, the same problems I experienced three decades ago still exist. Like it or not, wearing a scuba mask is not pleasant for most people, making you feel cut off from your surroundings in a way that's just not natural.
This is why the metaverse, when broadly adopted, will be an
augmented reality environment accessed using see-through lenses. This will hold true even though full virtual reality hardware will offer significantly higher fidelity. The fact is, visual fidelity is not the factor that will govern broad adoption. Instead, adoption will be driven by which technology offers the most natural experience to our perceptual system.
>> Read more.