Thursday, November 22, 2012

Dish TV wants to become Dish TV and Mobile

The Financial Times reports that Dish has won FCC approval to use spectrum they already own for LTE cellular communication rather than TV broadcast.

The approval came with a caveat regarding power limitations so as to avoid interference with adjacent spectrum that the FCC plans to auction next year. (That spectrum will also be used for LTE).

Dish said it was not all good news since the power restrictions could "cripple our ability to enter the business." Perhaps they are hoping to negotiate with the FCC over those restrictions.

The Wall Street Journal has speculated that Dish might partner with Google in forming a mobile communication company. Or perhaps Dish will sell the spectrum, which is now worth more than it was before the FCC approval.

Dish's move is reminiscent of the recent case in which the FCC turned down LightSquared, a startup seeking to offer LTE service using spectrum adjacent to GPS frequencies. LightSquared has a new proposal before the FCC, asking permission to share frequencies that are used by weather balloons.

Dish might provide some competition for the mobile cartel, but, then again, they have not exactly driven down the price of TV service. There are others as well: Google may join in with or without Dish, LightSquared may get a second chance, T-Mobile and MetroPCS want to compete, and mobile virtual network operators like Virgin and Ting are offering cut-rate prices. The cartel may be weakened.

Sunday, November 18, 2012

Republican critique of copyright and patent system withdrawn

The Well, an early online community, had a saying “you own your own words.”

What you say online may come back to haunt you, so think twice before posting something controversial.


Representative Jim Jordan of Ohio learned that lesson on November 16, when the Republican Study Committee (RSC), which he chairs, published “The Three Myths of Copyright,” a well-reasoned critique of the copyright and patent systems and their impact on the economy.

The critique was consistent with Jordan’s view of the government and free enterprise, but it disappeared from his Web site soon after it was published.

But Lauren Weinstein had made a copy of the RSC document and published it on his excellent blog. Note that he has marked it as “withdrawn,” since it is no longer on Jordan’s Web site.


Jordan has learned a lesson about Internet publication – you own your own words and they are difficult to erase.

And we citizens have gotten another look at the morality of politics and the rapidity with which principle and reason can be set aside.

PowerPoint presentation for teaching


Addenda

12/7/2012
The Republican Study Committee, a caucus of Republicans in the House of Representatives, has told staffer Derek Khanna that he will be out of a job when Congress reconvenes in January. The incoming chairman of the RSC, Steve Scalise (R-LA), was approached by several Republican members of Congress who were upset about a memo Khanna wrote advocating reform of copyright law. They asked that Khanna not be retained, and Scalise agreed to their request.

1/11/2013
Ars Technica interview of Derek Khanna, who was fired.

Friday, November 16, 2012

I blew it -- Twitter was cool on election night


In a recent post, I argued that The AP's interactive results map was better than Twitter, hangouts or a TV stream for watching the election returns online because the map was interactive, putting the user in charge.  I still like the map a lot and have not changed my mind about TV or the hangouts, but I sold Twitter short.

The problem was that I only followed one person's Twitter feed -- Andy Carvin.  I picked Andy because he practically invented the notion of reporting events via a live Twitter stream during the Arab Spring.

I said I found Andy's election coverage slow and often uninteresting, but I overlooked the fact that Twitter is social.  I should have followed many feeds, not just Andy's.  For example, if I had been following the candidates' feeds, I would have seen President Obama's victory tweet, the most re-tweeted post ever.

Twitter has published other statistics and memorable tweets.  It turns out they hit a rate of 327,452 tweets per minute, and did not crash.  (They were known for crashing under load in the past).

A terrorist could jam an LTE base station for $650

Virginia Tech professor Jeffrey H. Reed and Marc Lichtman, a graduate student, filed a comment with the National Telecommunications and Information Administration showing that a terrorist could crash an LTE base station serving thousands of people by jamming its control signals with a software-defined radio and a laptop.

They were commenting because there is a plan for a nationwide public safety broadband network that would use LTE wireless technology.

A Technology Review article on the filing estimates the cost of the jamming equipment at $650 and points out that 2G and 3G wireless would continue to work -- but 2G and 3G are gradually being phased out.

Reed and Lichtman do not offer a solution -- they merely point out the vulnerability.

Monday, November 12, 2012

A quick look at the use of the Internet in the 2012 election

In an earlier post, we looked at the election coverage on the Internet. Now, we take a quick look at the way the campaigns used the Net.

As in the last election, both campaigns used Facebook, Twitter and other social networks, but this time they moved on to the targeted advertising we are now used to on the Internet -- white males saw different campaign ads than their wives. (See articles in both The Economist and The New York Times.)

Perhaps Obama was more aggressive in tracking clicks than Romney.  The Times checked Obama's and Romney's Web sites during the campaign, and found that Obama was using 76 click-tracking services and Romney 40.  I checked yesterday and found that Obama was down to 32 and Romney only one (click on the image to the right to see which ones).
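
For readers curious how such a count might be approximated, here is a minimal sketch -- not the Times' methodology, which watched the live network requests each site made -- that lists the distinct third-party hosts referenced by a page's scripts, images and iframes, using the Python requests and BeautifulSoup libraries. The URL is just a placeholder.

# Rough proxy for "how many third-party services does this page load?":
# count the distinct third-party hosts referenced by scripts, images and
# iframes. A real audit would watch the live network traffic, including
# requests triggered by JavaScript, so this undercounts.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def third_party_hosts(url):
    page_host = urlparse(url).hostname
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hosts = set()
    for tag in soup.find_all(["script", "img", "iframe"]):
        src = tag.get("src")
        host = urlparse(src).hostname if src else None
        if host and host != page_host:
            hosts.add(host)
    return sorted(hosts)

if __name__ == "__main__":
    for host in third_party_hosts("http://www.example.com"):
        print(host)
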
In another article on the use of the Internet in the election, The New York Times gave the Obama campaign the edge in its use of the Net in organizing volunteers for door-to-door canvassing, phone calls and fundraising. They did that using a Web site called Dashboard and mobile apps that could access it.

A couple of factoids to establish context: The Economist quotes Borrell Associates as estimating online ad spending in 2012 at $160 million, six times what it was in 2008, but that remains a small percentage of the estimated $6 billion spent on the election. The spending figure understates the impact of online campaigning, because online campaigning costs very little -- the Obama campaign built about 200 different programs that ran on Amazon's cloud services.

Wednesday, November 07, 2012

NPR's interactive map tops election night coverage on the Internet

How did you watch the 2012 election results? I watched on my laptop. As shown below, there were several approaches to the coverage -- local TV stations holding hangouts, streaming TV coverage from ABC News, live tweeting by Andy Carvin (@acarvin) and an interactive map on the NPR Web site.

Which to watch? I tried them all and frankly found three to be boring.

ABC's coverage consisted of periodic local and national vote updates with "pundits" talking about what it all meant.  I found it slow and much of it was irrelevant to me. Streaming linear TV online is like making a movie by setting a camera on a tripod and recording a stage play.  Old wine in a new bottle.

I found that Andy Carvin's tweets came in too slowly and, like ABC's stream, often concerned things I was not interested in. Carvin has been live tweeting events in the Middle East for a couple of years and is probably our best, most experienced live tweeter. (See his book Distant Witness). If he can't make live tweeting of election results work, the medium is probably not a good fit.

(Correction after posting -- I blew it -- Twitter was cool -- I should have followed more than Andy Carvin).

I tried several hangouts in which a local TV reporter discussed the election with the public, and found them boring and uninformative. I'd rather listen to pundits.

For me, the clear winner was NPR's interactive map. In retrospect, that is no surprise. The key is that it is interactive. Unlike the others, it let me be active, determining what I would see.

As shown below, the map page is divided into three sections. The largest is a map of the US. Above that is a graphic summary of the current state of the presidential, senate, house and gubernatorial races. Interactive results were displayed to the left of the map.

The summary at the top has small red, blue and gray spots -- red signifying a decided Republican victory, blue a decided Democratic victory and gray undecided. The bar below that summarized the state of the presidential election at that time.

The dark red and blue areas signify electoral votes won by the two candidates. The light red and blue areas signify likely electoral votes, and the gray area in the middle represents votes that were too close to call. The image shown above was snapped Tuesday evening, when the race was close. By Wednesday morning, the summary showed that Obama had won the presidency, the Democrats controlled the Senate and the Republicans had retained control of the House and won a majority of the gubernatorial elections.

You could drill down by clicking on a state.  Below I clicked on California, and the current vote counts for the state's presidential, senate, house and gubernatorial races were displayed to the left of the map.

One could drill down further. The results as of Wednesday morning for the House race in California's 30th District are shown below.

You could also check the tally of ballot initiatives for each state. California's are shown below.

I don't want to leave the impression that NPR was perfect.  The maps were poorly rendered and I had some quibbles with the user interface, but, for me, NPR's interactive map was the election coverage winner.  This reminded me of my comparison between the BBC and NBC coverage of the 2012 Olympic Games. BBC's coverage was more interactive than NBC's. NBC's was more like watching television. There is a general lesson here -- the Internet is an interactive medium. 

------


Added after posting

The NPR site is displaying data from the Associated Press.  It turns out that Google also presented the same data (http://bit.ly/Up9c0D) and it was also posted on the C-SPAN Web site (http://cs.pn/XnY58x).

Monday, November 05, 2012

Pearson announces Project Blue Sky, a discovery service for open educational resources

There are tons of useful open educational resources (OER) -- ranging from Creative Commons textbooks to narrowly focused videos, images, presentations and other material. The problem is that it is difficult for a teacher to discover these free jewels.

Pearson hopes that their newly announced Project Blue Sky will fill that gap. The site is not yet live, but the image shown below illustrates its user interface.


OER discovery is a tough nut to crack. Services like MERLOT have tried for years, but what percent of your teaching material did you find on MERLOT?

I personally am looking for much finer grained OER than an open textbook or video. How about a cool image, quote or anecdote to illustrate a point I am trying to make in class?

Pearson says Project Blue Sky will allow instructors "to search, select, and seamlessly integrate Open Educational Resources with Pearson learning materials." The last part about seamless integration indicates their motivation, and, if they are not careful, their Achilles heel. I am not interested in closed silos.

That being said, I am hoping that Pearson will be able to crack the OER discovery nut and will be keeping an eye on the effort.

#digilit #jiscdiglit #highered #edreform #MOOC #pedagogy #EDUCAUSE #bonkopen #merlot #Pearson

-----

Pearson Project Will Let Professors Mix Free and Paid Content in E-Textbooks - Wired Campus - The Chronicle of Higher Education.

Fostering interaction and spontaneity in a MOOC -- in-class students as co-stars

MOOCs have shown that the presentation of material scales dramatically -- thousands of people are willing to watch interactive video presentations -- but we have not demonstrated the ability to scale interaction.

One approach is to encourage peer interaction using threaded discussion, peer grading, social media and face-face meetings in study groups.

But, can we also find ways to scale the sort of spontaneity and interaction that takes place in a face-face (FF) classroom? I doubt that we can ever achieve the level of exchange and enthusiasm in an outstanding classroom session, but those are atypical. A more realistic goal would be to scale interaction to the level that occurs in an average FF class session. I think (hypothesize) that is achievable. Let me give a couple of examples of attempts at using classroom interaction in a MOOC -- one that worked and one that did not.

I am currently dropping in on, though not following with any discipline, "A History of the World since 1300," offered through Coursera by Princeton professor Jeremy Adelman.

In addition to typical video lectures with breaks for quizzes, they have tried to bring in some classroom interaction using "global dialogs," in which Professor Adelman and a guest scholar hold a conversation in front of a FF class. I've only watched one of these conversations from start to finish (45 minutes), but it did not work. The class observed, but did not participate.

The camera was focused on the professor nearly all the time. About half a dozen times it cut to the audience, which was motionless. (The image shown below was just after Professor Adelman tried to lighten the atmosphere with a quip). The fact that the speakers, not the class, are the stars of the show is emphasized by the class shots being badly out of focus.


This is not to beat up on Professor Adelman or Coursera -- we are all experimenting at this stage of the game.

Now for an example that worked well. My first MOOC experience was in 2006, when I "took" Professor Charles Nesson's Harvard Law School course "CyberOne: Law in the Court of Public Opinion." There were three groups of students -- Harvard law students who attended in a traditional lecture hall and received course credit, extension students who met in Second Life and received extension credit, and the general public, which followed via weekly podcasts without credit. The course wiki, student notes, lecture videos, and student projects were all online under a Creative Commons license, and we podcast lurkers were encouraged to participate.


I did not watch the class videos, but listened to audio recordings of the class sessions. I heard Professor Nesson lecturing and leading discussions with the students in the room. He was informal and encouraged participation. In spite of the fact that I was only listening to audio recordings while working out in the gym, I became quite involved in the class. I looked forward to the podcasts, read the online material and corresponded a bit with Professor Nesson and the TA (his daughter) via email. I remember the class fondly and would say that the interaction scaled quite well.

The global dialog discussions in the history class are interesting and the participants have deep knowledge of the subject matter, but they are presentations by experts which are passively observed by an audience.

What would I do if I were teaching a MOOC?

I teach a digital literacy course, and I would run the MOOC in lockstep with an on-campus section. Like Professor Nesson, I would have our instructional technology staff record every FF class session and edit and post those weekly. The FF students would be co-stars of the videos.

As I do today, I would divide FF class time between lectures based on prepared teaching modules and topical material. Let's take a quick look at both the lectures and topical material.

The lectures are based on teaching modules consisting of pre-recorded lectures (5-10 minutes plus breaks for interaction), transcripts of those lectures, and the lecture slides. I present about half of the lectures in class and assign the others for self-study.

The MOOC students could watch the in-class videos of the lectures as well as the pre-recorded videos. While the live lectures are based on the same material as the pre-recorded videos, they are not scripted. I choose different words and speak differently. Unscripted examples or ways of saying things occur to me. I respond to student questions and stop to ask questions, which the students either answer or discuss with their neighbors. The goal of the in-class recording would be to capture as much of this interaction as possible.

In addition to lecturing, I devote in-class time to topical material. I prepare weekly discussion presentations in the same format as my pre-recorded lectures. A portion of the topical material is class feedback -- common misconceptions I find while grading their weekly quizzes and assignments and the results of anonymous study-habit polls with questions like "did you review presentation X before coming to class?" I also present things that occurred to me after class during the previous week.

While some of the topical material is based on our class, most of it is triggered by current events that are relevant to the class. For example, this week (tomorrow) we will talk about the damage to the Internet from Hurricane Sandy; the concept of fair use in copyright, triggered by a warning I just got for a YouTube video I posted; Google image search, which has been around for some time but which I had not tried until this week; the use of new media (radio, TV, the Internet) in political campaigns, triggered by President Obama's Google Hangout and Reddit "ask me anything" sessions; and the global diffusion of fourth-generation cell technology, triggered by a trade association report.

Could we capture the spontaneity and interaction of an excellent FF class using video recording and presentation of topical material? Probably not, but my hypothesis is that we could capture some of it -- perhaps as much as goes on in an average FF class.

Teaching assistants and I would also hold online office hours (Google hangouts) discussing topics and questions submitted by students during the hangout or before. (These office hangouts might run considerably longer than typical course office hours). Videos of the hangouts would be posted online.

There is a question of motivation in the FF class. We all know that some classes are more engaged and lively than others. If the class is to be the MOOC co-star, I would explicitly try to involve and motivate them -- to build esprit de corps and encourage active participation. I am not sure how to do that, but I would let them know they had a responsibility to speak for the MOOC students. I would also encourage communication between MOOC and FF students. I would try bribery with refreshments served during class. I would also try using grades as leverage -- taking class participation, attendance and participation in office-hour hangouts into account. What else?

#digilit #jiscdiglit #highered #edreform #MOOC #pedagogy #EDUCAUSE #bonkopen #coursera

Friday, November 02, 2012

Internet damage from Hurricane Sandy -- the Internet senses its own failures

Still from Renesys animation
This is a follow-up to yesterday's post on the Internet damage done by Hurricane Sandy. That post described data center outages and Paul Baran's 1964 RAND reports spelling out the rationale for and design of a packet-switched network. It also included a link to a terrific interview with a data center CEO who struggled to stay on-line during and after Hurricane Katrina in New Orleans.

The Internet senses its own failures in two ways, automatically and in cooperation with humans.

A network is removed from the global routing table a few seconds after it goes down. Renesys tracks the dynamic state of the Internet by monitoring that table.

This animation shows the percentage of networks that are down in small geographic areas hit by the storm. Dark green indicates that at least 99.95% of the networks are up, and dark red indicates that more than 5% are unreachable.

They report that in Manhattan the typical outage rate is around 10% and point out that "silencing ten percent of the networks in the New York area is like taking out an entire country the size of Austria, in terms of impact on the global routing table." That is the bad news. The surprisingly good news is that the other 90% are still up -- data centers running on backup diesel power and caffeine.
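
The arithmetic behind a map like that is simple once you have the routing data. Here is a minimal sketch of the idea, with made-up prefix lists standing in for the per-area baselines Renesys builds from its live BGP feeds: compare the prefixes normally announced from each area against a current snapshot of the global routing table and report the share that has disappeared.

# Sketch of the outage-map calculation. The prefixes below are invented
# (documentation/test ranges); Renesys derives the real baselines and the
# current table from BGP feeds collected around the world.
baseline = {
    "Manhattan": {"192.0.2.0/24", "198.51.100.0/24", "203.0.113.0/24"},
    "Long Island": {"198.18.0.0/24", "198.19.0.0/24"},
}

# Prefixes still present in the global routing table after the storm.
current_table = {"192.0.2.0/24", "203.0.113.0/24", "198.18.0.0/24", "198.19.0.0/24"}

for area, prefixes in sorted(baseline.items()):
    missing = prefixes - current_table
    pct_down = 100.0 * len(missing) / len(prefixes)
    print("%s: %.1f%% of networks unreachable" % (area, pct_down))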

On the right, we see another Renesys view showing network outages by state over time. As we see, New York was the hardest hit, with around 1,200 networks off-line at the peak, but some have come back on-line.

Renesys monitoring is automatic, but people are also monitoring the network with the aid of tools like Twitter. Andy Carvin is known for his use of Twitter and other Internet tools in producing real time news reporting (see the presentation at this location) and the same approach has been used in disaster reporting.


Using Twitter and other sources, Rich Miller of the Data Center Knowledge blog has been reporting on data center outages.

J.C.R. Licklider
Internet damage is caused by flooding and power outages, and the Internet is also used (by people) to report power outages. Those reports are aggregated and mapped in real time at Web sites run by utilities like Con Edison, which serves Manhattan, and the Long Island Power Authority. (Note the overlap between the power-outage and Internet-outage maps).

Finally, let's note that the human-Internet collaboration on disaster reporting or any other task was anticipated long ago by J. C. R. Licklider who, in the 1960s, wrote of man-computer symbiosis, envisioned the Internet and was instrumental in funding much of the research that led to the Internet and modern personal computers.  (Read two of his highly influential papers here).