Tuesday, May 05
Carl W. Taylor, IIA Ambassador and Director of the Center for Strategic Health Innovation at the University of South Alabama College of Medicine, attended a recent telemedicine conference in Las Vegas. Here’s his report:
This is not the blog I intended to post from the American Telemedicine Association meeting in Las Vegas. I was going to blog on the rapid emergence of large corporations into the telehealth arena. To be sure, there are still small- to mid-size company innovations, such as InTouch Health's robotics and Vecna's health kiosk and deployable, field-ready health stations. But the real buzz this year is the growing presence of companies like Cisco, Bosch, and Intel. After four decades of being the odd rounding error in the healthcare industry, telehealth may finally be ready to emerge as a valid and commercially viable delivery option. My preference is to remain hopeful that the e-health, HIT, EHR, Health 2.0, informatics, and telehealth industries will consolidate, or at least learn to play nicely together, so we don't go from paper silos of fragmented care to digital silos of fragmented care. At the very least, the emergence of large companies whose portfolios also include e-health and Health 2.0 strategies should push this consolidation. The ATA conference was also unusual in that it grew this year despite a contracting economy; though there were a few no-shows due to travel restrictions in state budgets, the lure of stimulus dollars and an expanded industry presence made for an outstanding turnout.
Now, let me leave traditional healthcare delivery behind and talk about what I think is an even more timely issue. As I write this, we are on the front end of trying to understand the H1N1 swine flu outbreak. Half of my day job is teaching disaster preparedness and deploying a situational awareness software tool to over 1,000 healthcare users (we give it away, so this isn't a pitch). We have worked hurricanes, tornadoes, and other natural disasters for several years in the Deep South. For the response to those events, our state- and federally managed system of human volunteer deployment works well. However, I believe there will be challenges to moving human volunteers into areas of any widespread outbreak. Widespread infectious disease outbreaks highlight the need for a broadband communication response system of virtual medical care delivered to the point of need. What will be needed, if not in this outbreak then certainly in others, is robust, real-time one-to-one, peer-to-peer, and one-to-many healthcare.
In many parts of this country, specialists with critical skills are embedded in large urban medical or university settings. Widespread outbreaks, particularly those with animal vectors, are just as likely to be found in rural settings. Consequently, connecting specialists virtually to patients, without the need for either to travel, is simply a paramount requirement. There are good examples of networks poised to deliver this kind of virtual care, such as the Montana Infectious Disease Network, whose work was presented at the ATA Disaster Special Interest Group program yesterday morning (disclaimer: I had a very small role in establishing that network). I recognize that building a business plan around low-frequency but high-severity disasters is difficult, but once established, these networks can, and should, also support daily healthcare needs. Regardless, as we consider the emerging consequences of this event, the need to develop regional virtual disaster medical assistance teams with robust communication connectivity becomes apparent.
With cybersecurity currently a hot topic in Washington, D.C., researchers at the University of California, Santa Barbara have taken the step of hijacking a botnet in order to see just how much damage it does. Ars Technica has the scary scoop:
UCSB’s researchers were able to gather massive amounts of information on how the botnet functions as well as what kind of information it’s gathering. Almost 300,000 unique login credentials were gathered over the time the researchers controlled the botnet, including 56,000 passwords gathered in a single hour using “simple replacement rules” and a password cracker. They found that 28 percent of victims reused their credentials for accessing 368,501 websites, making it an easy task for scammers to gather further personal information. The researchers noted that they were able to read through hundreds of e-mail, forum, and chat messages gathered by Torpig that “often contain detailed (and private) descriptions of the lives of their authors.”
The University’s full research paper is available here.
The New York Times is reporting that the FTC—that would be the Federal Trade Commission—is investigating whether Apple and Google are a little too cozy:
Apple and Google share two directors, Eric E. Schmidt, chief executive of Google, and Arthur Levinson, former chief executive of Genentech. The Clayton Antitrust Act of 1914 prohibits a person’s presence on the board of two rival companies when it would reduce competition between them. The two companies increasingly compete in the cellphone and operating systems markets.
Antitrust experts say the provision against “interlocking directorates,” known as Section 8 of the act, is rarely enforced. Nevertheless, the agency has already notified Google and Apple of its interest in the matter, according to the people briefed on the inquiry, who agreed to speak on condition of anonymity because the inquiry was confidential.
The FTC’s investigation is still in its infancy and, according to the story, nobody is yet commenting. Stay tuned…
How will a national broadband policy succeed? Telephony Online has a long article attempting to answer the question:
Successful National Broadband Policies across the globe have three distinct features: (1) Definitive goals to provide “x” bandwidth to “x” percent of population by “x” date; (2) some form of government financing; and (3) telecom policy that supports the goals of the plan. In addition, many of the plans also have specific goals related to broadband adoption, not just availability, and develop government policy and programs to support those goals.
Another key element of most National Policies is the fact that a market analysis detailing the competitive environment, the market position of the incumbents, availability and affordability of broadband has been undertaken ahead of policy making.
The full article is definitely worth checking out.
A new national broadband strategy has been offered, and as GovTech reports, it’s being offered by the nation’s universities:
Last week, 200 universities nationwide offered a national strategy to the Obama Administration “as a first step in realizing (his) vision of bringing the benefits of broadband technology to all Americans.”
The plan was offered to NTIA—The National Telecommunications and Information Administration—which has $4.7 billion to help build our national information infrastructure as part of the so-called stimulus plan passed by the Congress earlier this year.
As for the plan itself, Blandin on Broadband nutshells it:
A National Broadband Strategy should begin with America’s colleges and universities, community colleges, K-12 schools, public libraries, hospitals, clinics, and the state, regional and national research and education networks that connect them and extend to reach government agencies, agricultural extension sites, and community centers across the nation.
The full plan—titled “Unleashing Waves of Innovation: Transformative Broadband for America’s Future”—is available in PDF form.
From the Sacramento Bee:
The good news is that 96 percent of California’s households have access to a high-speed Internet connection.
The bad news is that despite the good news, 45 percent of California residents – a number greater than the populations of all but five states – still don’t have broadband connections in their homes because of geography, disabilities, a lack of English language skills or poverty.
Now the promising news: The state is poised to grab as much as $1 billion in federal stimulus money for closing what’s referred to as a “digital divide” between Internet haves and have-nots.
With stimulus dollars still up for grabs, expect more states to try and get in on the action.
Monday, May 04
Amazon is about to release a third version of its popular Kindle, this time aimed at newspaper and magazine readers.
Whether the device—which reportedly has a larger screen—will be a hit with dead-tree diehards remains to be seen. But given the overwhelming cost of traditional printing, the future of journalism is definitely online.
Last week, the Obama administration announced it was venturing into social networking by joining sites such as Facebook, MySpace, Twitter, and Flickr. From the official White House blog:
In the President’s last Weekly Address, he called on government to “recognize that we cannot meet the challenges of today with old habits and stale thinking.” He added that “we need to reform our government so that it is more efficient, more transparent, and more creative,” and pledged to “reach beyond the halls of government” to engage the public.
While the government certainly needs shake off “old habits and stale thinking,” the move into social networking isn’t without concerns from privacy groups. As the New York Times “Bits” blog reports:
The privacy advocates’ biggest concern is that most social networks treat a government agency no differently than a former roommate. People might friend the White House on MySpace, for example, to indicate support for the president or to get messages about what the administration is doing. In doing so, however, they are agreeing that every party photo, love poem, and wisecrack from a friend that appears on their profiles will be visible to White House Webmasters. And so far there are no guidelines that say whether those Webmasters might keep copies of any of the personal information they see or send it to the government officials who could use it to get authorization to audit people’s taxes, keep them from boarding an airplane, tap their telephones or even arrest them.
In response to concerns, the White House had this to say:
“We are focused on opening government to the people (and not the other way around), and like with any other online friends, the individual users can still choose to keep information private using their privacy settings,” said Moira Mack, a White House spokeswoman in an e-mail. “The White House takes privacy seriously and we are engaged in an ongoing conversation with privacy advocates to ensure that we are aware of the latest concerns and issues.”
According to a new report from marketing research firm In-Stat, web-to-TV video streaming is about to explode—to the tune of 24 million households within five years.
This, obviously, poses a problem for traditional TV providers—and might explain why some cable companies have been making noises about metered broadband. Will cable companies turn to usage caps to make up for lost revenue?
Search engines have come a long way, but as The Independent reports, a revolution may be on the horizon:
The new system, Wolfram Alpha, showcased at Harvard University in the US last week, takes the first step towards what many consider to be the internet’s Holy Grail – a global store of information that understands and responds to ordinary language in the same way a person does.
Making Internet searches more conversational is cool and all, but Wolfram Alpha has something else up its sleeve:
The real innovation, however, is in its ability to work things out “on the fly”, according to its British inventor, Dr Stephen Wolfram. If you ask it to compare the height of Mount Everest to the length of the Golden Gate Bridge, it will tell you. Or ask what the weather was like in London on the day John F Kennedy was assassinated, it will cross-check and provide the answer. Ask it about D sharp major, it will play the scale. Type in “10 flips for four heads” and it will guess that you need to know the probability of coin-tossing. If you want to know when the next solar eclipse over Chicago is, or the exact current location of the International Space Station, it can work it out.
An Internet that not only stores information but can also work out problems? What could possibly go wrong?