Wireless charging is a challenging technology. While the benefits are obvious (imagine having just one cord, the one supplying power to the charger itself!), the concept has yet to see the mass implementation and adoption one would expect. Today, we’ll look at this underperforming technology’s history, a few applications we could see in the future, and the obstacles currently holding it back.
The Charged-Up History of Electricity
As one might imagine, wires have been getting in people’s way ever since electricity was widely adopted. This was one of the conundrums that Nikola Tesla, the underappreciated inventor and engineer, spent a considerable portion of his life trying to solve. Eventually, he devised a device that used magnetic fields to transfer electricity without wires, a device we know today as the Tesla coil.
Unfortunately, a guerrilla marketing campaign by his electric rival, Thomas Alva Edison, worked to turn public opinion away from Tesla’s approach and toward Edison’s own direct current. Most infamously, when Topsy, a circus elephant that had killed her handler, was to be put down, Edison’s camp volunteered to do it with alternating current, the key facet of Tesla’s European-inspired approach, and filmed the electrocution as proof of AC’s dangers. Yet not even that spectacle was enough to prevent AC from becoming the standard.
Eventually, Tesla’s coil was improved upon further, ultimately finding a place in early radio and radar systems. Yet even as the technology became more powerful, a market in which to use it proved difficult to find, leaving much of the work done by multiple private-sector businesses, as well as NASA and the U.S. Department of Energy, with little to show for it.
Enter the Smartphone
One of the biggest obstacles to the progress of wireless charging was that, while the technology is logically suited to mobile devices, truly mobile devices simply didn’t exist until recent years. Cellular phones changed that: here, at last, was a device meant to be fully portable, in other words, mobile. Other device categories quickly followed suit, and wireless charging capabilities came with them.
However, even this smartphone-driven revitalization of wireless charging couldn’t change one fact: somewhere, there was always going to be a wire. Even the most recent wireless chargers need to be plugged in before they will work, and really, what’s the real difference between plugging in a device and setting it down just so on a charging pad?
What it Means to Really Be Wireless
In order for any technology we develop to be marketable, it pretty much has to meet two key criteria: it has to work, and it shouldn’t do the user any harm.
While this might seem like a relatively low bar, no mass-market product has yet cleared it. That isn’t for lack of trying, however. Companies like Energous in Silicon Valley, Ossia in Bellevue, Washington, and uBeam in Santa Monica, California, have all worked on technologies aimed at meeting the criteria outlined at the beginning of this section: charging devices over the air, with no physical contact at all. As they find more success in developing these uncoupled power solutions, a new paradigm takes form.
Someday, it may not be uncommon to see wireless charging spread to devices other than smartphones. More and more IoT devices, like wearables, as well as medical devices, like hearing aids, may someday no longer need a dedicated charge cycle during which they can’t be used.
Until that day comes, wireless charging will remain either ineffective or underutilized… but which devices would you like to be able to charge wirelessly in the future? Tell us which and why in the comments.