Monday, July 29, 2013

The largest data center in Los Angeles sells for $437.5 million, or $660 per square foot

From the outside, One Wilshire looks like an ordinary office building. About a third of the building is occupied by offices, but the rest is an Internet exchange point and data center -- full of routers and servers belonging to about 300 voice and data service providers. It is the key US hub for connections to Asian undersea cables. That makes it some of the most expensive real estate in the world -- it just sold for $437.5 million.

Conduits rising from the pavement give a clue that this is not an ordinary office building.

LA Times photo

Moving inside, we see the "meet me room," where networks exchange data. This is one of twelve images in an exhibit by the Center for Land Use Interpretation.

If you like lurking inside large data centers, check out this photo tour of a Google data center. Here's a sample image:

The Syncom satellites and Arthur C. Clarke

The LA Times ran a cool article on the development of the Syncom satellites. Syncom 1 failed, but Syncom 2, which was launched 50 years ago on July 26, 1963, succeeded in transmitting voice and image-only TV, and President Kennedy used it for a call to the Prime Minister of Nigeria in August 1963. Syncom 3, which launched August 19, 1964, was used to telecast the 1964 Olympic Games.

The Times article describes the three Syncom engineers, shown below, and their project.

The Times article relates the trouble they had selling the project, but in retrospect, it ended up being a very small investment that launched an industry and re-shaped global culture. (It was similar to the Internet in that respect). The article also contains an interactive graphic showing the satellites currently in orbit.

My only criticism of the article is that it gives the impression that the lead engineer, Harold Rosen, was the first to conceive of a geostationary communication satellite. That concept was suggested by others and fleshed out and popularized as a communication tool by the famous science fiction writer Arthur C. Clarke in a 1945 letter to the editor and subsequent article in Wireless World Magazine. This graph, from Clarke's article, shows the equilibrium altitude for a geosynchronous satellite:

So, in addition to reading the LA Times article, I'd recommend checking these out:

Time flies -- the Earth now has a fiber-optic nervous system that far outstrips communication satellite capacity. That would have been hard to predict in 1963, but if you think Arthur C. Clarke would have been surprised, read his book on the cabling of the Earth, How the World Was One.

Saturday, July 27, 2013

Chromecast blunders -- does Google need a few MBAs?

Google's new Chromecast, a device for streaming video on a TV set, might turn out to be terrific, but they have bungled the marketing and production.

I was on the Yet Another Tech Show podcast Wednesday and, based on what they had read, all the panelists agreed that Google's new Chromecast was a terrific product at a no-brainer price.  So, being a cord cutter, I ordered one.

After I placed my order, I learned that I had missed the deadline for a three-month Netflix credit premium. I was satisfied with the purchase at first, but now I am somewhat pissed at Google for giving all those folks who ordered a few days earlier a better deal.

It also turned out that Google blew the production planning -- my Chromecast is expected to ship in three or four weeks.

I came across a tweet saying that nearly 250,000 people had ordered one before the Netflix promotion was pulled -- who was planning logistics and production?

I had to wait, but it will be worth waiting for, right? I hope so, but astute blogger Bob Cringely suspects that the Google demo overstated the capability of the Chromecast, and when I checked the customer reviews on Amazon, I found that, while most are favorable, there is some dissent. Some folks could not get it to work, experienced poor-quality audio or video, or discovered that they needed a second port to power the device if they had an older TV with pre-version 1.4 HDMI ports.

I didn't cancel my order -- I still hope to love the thing, but I can't help wondering if Google, with all of its PhD engineers and scientists, couldn't use a few more folks with MBAs in production management and marketing. Remember the Nexus Q?

Monday, July 22, 2013

Review of online coverage of the 2013 Tour de France from NBC and ITV4 (UK)

I watched the Tour de France online again this year. (See my reporting on the Tour from last year.) In order to compare NBC's coverage in the US with that of the UK's ITV4, I subscribed to NBC's US coverage ($29) and also watched ITV4 via an English proxy server. Both used the same video feed, but they differed in several ways -- let's look at some of the main differences.

Video players and interaction

The NBC player ran in its own Silverlight window as shown here:

Zooming in on the controls, we see (from left to right) that the user can share a link on Facebook, Twitter, etc., adjust the speaker volume, rewind 15 seconds with a single click, see how far behind the live stream they are on the timeline, jump to the live stream, reveal running commentary (tweets) and links to highlight videos and go full screen.

I preferred to leave the running commentary and links to highlights hidden, but here is a screen shot with them revealed:

NBC used what seemed to be the same player in the 2012 Tour de France, and you can see a more complete description here.

The NBC experience was far better and more interactive than that of ITV4. ITV4 emulated watching TV, merely displaying a live video stream inside a Web page. As you see in this screen shot, the user could only stop/watch the live stream, adjust the volume or toggle between full screen and windowed mode.

NBC clearly has a better, interactive video player. It may be that ITV4 simply showed the live stream because the right to archive it carried a large fee.

Streaming performance

My laptop has 8GB of memory and a solid state drive, and my cable service typically delivers 15 Mbps, but my three-year-old CPU/graphics controller is not up to the task of watching full-screen video on a TV set, so I did most of my watching on the laptop screen.

When I tried to go full screen on a TV set, the CPU fell steadily behind, whether I was watching NBC or ITV4 or using Chrome, Firefox or Internet Explorer. NBC would begin at its maximum speed of 2.5 Mbps, but it would automatically step down to 350 Kbps, then start dropping frames. ITV4 did not display its speeds, but they seemed to be about the same, and the results were as well.
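The step-down behavior described above is typical of adaptive streaming players: monitor what the client can actually sustain, switch to a lower rung of a bitrate ladder when it falls behind, and drop frames once there is no lower rung left. Here is a toy sketch of that logic -- the intermediate rungs and the decision rule are invented for illustration; only the 2.5 Mbps and 350 Kbps endpoints come from what the NBC player displayed:

```python
# Toy model of adaptive bitrate step-down. The ladder's endpoints (2500 and
# 350 Kbps) match what the NBC player showed; the middle rungs are invented.
LADDER_KBPS = [2500, 1400, 800, 350]

def next_action(current_kbps: int, sustained_kbps: float):
    """Decide what a player might do given the rate the client is sustaining.

    Returns ('keep', rate), ('switch', lower_rate), or ('drop_frames', rate).
    """
    if sustained_kbps >= current_kbps:
        return ("keep", current_kbps)
    # Find ladder rungs below the current rate that the client can sustain.
    lower = [r for r in LADDER_KBPS if r < current_kbps and r <= sustained_kbps]
    if lower:
        return ("switch", max(lower))
    # No sustainable rung left: stay at the bottom and drop frames.
    return ("drop_frames", min(LADDER_KBPS))

print(next_action(2500, 900))   # ('switch', 800)
print(next_action(2500, 300))   # ('drop_frames', 350)
```

A real player also smooths its throughput estimate and buffers ahead, but the core decision loop looks much like this.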

This is a temporary issue.  A fast laptop would probably be able to keep up with either and future versions of the player software will be more efficient.

Current status presentation

As shown here, ITV4 provided a nice race status display below the video window. Scrolling down further, you saw the overall standings and standings within categories like mountain climbing and sprinting.  It was convenient and easy to access.

Since the NBC player runs in a separate window, one had to open the video player and a Web browser side by side, as shown here, to achieve the same functionality.

Aligning and sizing the two windows was a bit of a bother.  Since one could scroll down and see pretty much the same current status information on both, I'll give ITV4 the edge in convenience.

Ancillary content and archives

NBC had a lot more ancillary content than ITV4.  They had more prepared material and, since they archived the video, a much richer selection of replays.  Ancillary content was organized as shown in the tabs below.  For example, the Track tab showed the positions of the groups of riders on the course on a Google map.  

NBC easily wins the archive comparison. They saved preview videos and full stage replays for each stage along with 213 stage highlight videos and 37 historical and human interest "tour extra" videos. ITV4 archived only a three minute highlight of each stage and ten short tour extras.  

That is the good news on NBC. The bad news is that they deleted last year's archive shortly after the race and will probably do the same this year. The BBC and NBC also deleted their 2012 Olympic archives.

Amazon taught us the value of "long tail" content years ago, but these guys don't get it. The cost of keeping an event archive online is very small and they are erasing history.

I guess it comes down to money -- couldn't they cover the costs with ads or access fees? Perhaps the problem lies with the Tour organizers. I assume NBC had to pay a fee for the right to archive the live stream, which was used to provide the ancillary material and the interactive player described above. If that is the case, the Tour should keep the archives online.

Commercials

ITV4 drove me nuts with commercials.  (Since I cut the cord a few years ago, I have really come to hate commercials).  You see the same ads every time you load their Web site.  I was surprised to see that the Nintendo Wii was a major sponsor in Europe, but I really got tired of seeing the same stupid ad.

But that was minor compared to their commercial breaks during slow times in the race. At one point I counted nine straight commercials without a stop, which seemed interminable. (Since NBC and ITV4 shared a video stream, NBC cut to historical or informative videos during the long commercial breaks.)

NBC had fewer ads.  Like ITV4, they played two commercials when you first came to the site.  One was a Microsoft ad favorably comparing a Surface tablet to an iPad.  It made the Surface look good, but, after seeing the same ad over and over, it became an irritant.

NBC also had banner ads above the video screen at times, but they were less obtrusive and often absent. I hardly noticed them, which also makes me wonder how effective they are.

The ads were much easier to take on NBC, but they were irritating, especially so in view of the fact that NBC charged $29 for the event.

That's my summary of the experience.  It is clear that both the technology and the business model are evolving, but the bottom line for me is that I prefer to pay NBC $29 to reduce the number of commercials and to get interactivity and archiving.  Being able to pause, have breakfast and resume where I had left off was enough to sell me.

Friday, July 19, 2013

Low pass rates in the San Jose State/Udacity experiment, but is pass rate a good metric?

Udacity, the MOOC platform company, plans to experiment with an online MS in partnership with Georgia Tech, and it has also tried offering a few courses for credit in partnership with San Jose State University (SJSU). The Georgia Tech experiment is just starting, but we have some preliminary results from SJSU.

I got a copy of a portion of a presentation on the SJSU-Udacity experiment. As you see below, they compare pass rates of the Udacity sections with the traditional classroom sessions. The Udacity results are disappointing, but I think "pass rate" is an outdated, pre-Internet metric for these courses. More on that later, but first, here is the presentation excerpt:
In Spring 2013, San Jose State University (SJSU) collaborated with Udacity - a for-profit online start-up - to offer basic Mathematics and Statistics classes. The Udacity leadership appeared in a news conference with SJSU President Mo Qayoumi and Governor Jerry Brown to tout this public-private partnership as a means of increasing both access and graduation rates at SJSU. This same Udacity leadership appeared with State Senate Pro-Tem Darryl Steinberg in the rollout of SB520 and in the presentation of SB520 to the Senate Higher Education committee where they described the collaboration as part of their vision for SB520.

As part of this "experiment" at SJSU, success rates comparing SJSU students in the online version of the three courses versus SJSU students enrolled in the traditional face-to-face (F2F) versions of the same courses were collected. In addition, there were some non-SJSU students also enrolled in the Udacity online courses. Below are the preliminary results of this experiment.

MATH 6L: Remedial/Developmental Math:
Udacity online version: 29% pass rate (14/49 passed; 2 withdrew)
Face-to-face version: 80% pass rate
Non-SJSU students in Udacity version: 12% pass rate (6/50 passed; 13 withdrew)

MATH 8: College Algebra:
Udacity online version: 44% C-pass rate (8/18 passed; 2 withdrew)
Face-to-face version: 74% C-pass rate
Non-SJSU students in Udacity version: 12% C-pass rate (8/67 passed; 20 withdrew; 17 unauthorized withdrawals (WU))

STAT 95: Intro to Statistics:
Udacity online version: 51% C-pass rate (19/37 passed; 1 withdrew)
Face-to-face version: 74% C-pass rate
Non-SJSU students in Udacity version: 47% C-pass rate (21/45 passed; 8 withdrew; 9 unauthorized withdrawals (WU))

The class sizes were small (these were not MOOCs) and the pass rates disappointing, but I am sure they learned from the experience. They are currently offering five classes, and those might yield better results.
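The percentages in the excerpt follow directly from the raw counts, which is worth checking given how small these classes were. A quick sketch (the counts are copied from the presentation excerpt above; the shortened course labels are mine):

```python
# Reproduce the pass rates quoted in the SJSU/Udacity presentation excerpt.
# (passed, total) counts as reported; withdrawals are already excluded from
# the numerators in the excerpt.
results = {
    "MATH 6L (Udacity)":  (14, 49),
    "MATH 6L (non-SJSU)": (6, 50),
    "MATH 8 (Udacity)":   (8, 18),
    "MATH 8 (non-SJSU)":  (8, 67),
    "STAT 95 (Udacity)":  (19, 37),
    "STAT 95 (non-SJSU)": (21, 45),
}

pass_rates = {name: round(100 * passed / total)
              for name, (passed, total) in results.items()}

for name, rate in pass_rates.items():
    print(f"{name}: {rate}%")
```

The computed rates match the excerpt (29%, 12%, 44%, 12%, 51%, 47%), though with denominators this small a handful of students either way would move the percentages substantially.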

But, is pass rate a reasonable metric of success in the Internet era? In my opinion, the notion of "passing" with a C, whether face-face or online, is a flawed metric of success for many courses. As I have said many times before, it is like putting old wine in a new bottle -- using new technology to mimic the past.

If you got a C in an introduction to statistics, you did not understand a lot of what was taught. Maybe you understood descriptive statistics, but not hypothesis testing. Instead of one grade, I'd prefer a fine-grained grading system in which one could, for example, pass "measures of central tendency," then "measures of variability," then "basic probability," etc. In that case, "passing" an introduction to statistics would mean passing a series of ordered modules and understanding all of the concepts and skills presented in the course.

I've been advocating and using modular material for many years, but always within the confines of the standard grading paradigm -- assign a letter grade from A to F for an entire course. With today's technology, we could combine modular teaching material with pass/fail grading at the module level. The technology is the easy part. Breaking up the traditional transcript will be tough.
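To make the module-level pass/fail idea concrete, here is a minimal sketch of what such record-keeping could look like. The `Course` structure and the statistics module names are my own illustration of the proposal, not part of any existing system:

```python
from dataclasses import dataclass, field

@dataclass
class Course:
    """A course as an ordered list of modules, each passed or not yet passed."""
    name: str
    modules: list                       # ordered module names
    passed: set = field(default_factory=set)

    def record_pass(self, module: str) -> None:
        if module not in self.modules:
            raise ValueError(f"unknown module: {module}")
        self.passed.add(module)

    def completed(self) -> bool:
        # "Passing" the course means passing every module -- there is no C.
        return set(self.modules) == self.passed

stats = Course("Intro to Statistics",
               ["measures of central tendency",
                "measures of variability",
                "basic probability",
                "hypothesis testing"])
stats.record_pass("measures of central tendency")
stats.record_pass("measures of variability")
print(stats.completed())   # False -- two modules remain
```

The hard part, as noted above, is not the technology but persuading registrars and employers to read a transcript made of module outcomes rather than letter grades.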

Tuesday, July 16, 2013

Vint Cerf -- a concise history of packets, the ARPAnet and the Internet

Conceptual sketch of the ARPAnet by Larry Roberts
If you are interested in the history of the Internet and only have 16 minutes to spare, watch this interview of Vint Cerf, who was one of the handful of people that created the Internet.

Cerf's narrative begins with the idea of packet switched communication and runs through the creation of the ARPAnet, followed by the invention of internetworking protocols to link three disparate networks -- the ARPAnet, a mobile communication network and a satellite communication network.

This short video is like an annotated table of contents of the early history of the Internet. Cerf introduces us to Leonard Kleinrock, Paul Baran, Donald Davies, Larry Roberts, Thomas Marill, J. C. R. Licklider, Doug Engelbart, Norman Abramson, Robert Taylor, Charles Herzfeld, Steve Crocker, Jon Postel, Bob Kahn, David Reed, Danny Cohen and Bill Joy, summarizing the work of each and putting it in context.

Whether this is all you want to know about the history of the Internet or you want to use it as a jumping off place to learn more about the contributions of these people, this is a good place to start.

Cerf also conveys a sense of common purpose among those pioneers. He does not say so, but one can think of them as working together to realize Licklider's vision of a network running Engelbart's applications. Cerf makes it clear that, although they worked for several different organizations and changed jobs from time to time, these people knew each other well and collaborated closely on creating the Internet. (For example, Cerf, Postel and Crocker went to the same high school and studied under Kleinrock as graduate students at UCLA). The group also had an excellent collaboration tool -- they were the first users of the networks they built.

The ARPA/Internet project was a great example of government as a non-equity angel investor -- providing a small bit of seed funding ($124 million) to a group of smart, dedicated people, rather than setting up a department to do the work internally.

Charles Severance of IEEE Computer magazine conducted the interview with Cerf; it is one in a series of computing conversations.

Saturday, July 13, 2013

A provocative column on Georgia Tech's $7,000 MS in computer science

Astute industry analyst Robert Cringely says that Georgia Tech's $7,000 online MS in computer science is watered down, will cheapen the Georgia Tech brand, will earn Georgia Tech a lot of money, and may be the future of education.

The self-paced (typically three-year) program is being developed with Udacity, aided by a $2 million grant from AT&T. It will start with 300 students, many of them AT&T employees, and hopes to expand to 10,000 students while hiring only eight new instructors.

Cringely sees this as a recipe for a "crappy" degree, but says it will make a ton of money because professional degree students typically pay for their education while research students provide cheap teaching and research labor funded by grants. He also sees the Georgia taxpayer subsidizing offshore students -- as Cringely puts it, "programmers in Bangalore will soon boast Georgia Tech degrees without even having a passport."

It is noteworthy that the courses will be offered as free MOOCs for those not seeking a degree -- only enrolled, degree-seeking students will pay and only they will get tutoring, online office hours, proctored exams, etc. We have talked of the importance of the non-degree MOOC audience in an earlier post. The business model here seems to be counting on for-credit students paying the cost of production, with a by-product of high-quality, free MOOCs.

What do you think? Is programmers in Bangalore getting low-cost degrees a bug or a feature? Will students pay $7,000 for certification from Georgia Tech when they can get the same content in a free MOOC? Will the free MOOCs turn out to be the most important part of this experiment -- particularly in developing nations? How will prospective employers value the credit and non-credit completion of the courses? Can eight faculty adequately serve 10,000 students? Will other universities follow suit?


Update 7/15/2013

Laura Gibbs pointed me to an in-depth discussion of the Georgia Tech MS CS program. Christopher Newfield challenged the economic projections in the Georgia Tech contract, and Udacity co-founder Sebastian Thrun answered in a blog post to which Newfield replied.

Thrun said a couple of things that caught my eye. One was that they predict that the majority of the income will come from non-degree students. We've written about the non-degree student market being potentially much more lucrative than that for degree programs. This venture will provide data on that hypothesis, but it will be a while before we know the results.

Thrun also said that data on the for-credit collaboration between Udacity and San Jose State University (SJSU) had been collected and would be released in a few weeks (from June 24). That should shed some light on the disagreement between Thrun and Newfield in spite of the fact that the students and the introductory undergraduate courses in the SJSU trial differ significantly from those of a graduate computer science degree at Georgia Tech.

We are just starting to innovate in online education after years of textbook facsimiles. Georgia Tech, SJSU and Udacity are experimenting with new models of certification and education financing, as opposed to teaching material and pedagogy. These are early, important experiments.


Update 7/18/2013

SJSU Provost Ellen Junn reported that students in three online classes did significantly worse than those in conventional classes.  A preliminary presentation showed that 74 percent or more of the students in traditional classes passed, while no more than 51 percent of Udacity students passed any of the three courses.  Junn emphasized that the results were preliminary and they plan to start working with Udacity again in spring 2014.


Update 7/29/2013

Slate takes a look at Georgia Tech's computer science MOOC and says it could change American higher education.

Update 8/18/2013

The New York Times says the master's degree is the "new frontier" of online study. They go on to highlight the Georgia Tech offering and discuss the future of MOOCs more broadly. Enthusiasts and skeptics are quoted. I've also revised the original post -- stressing the inclusion of free MOOCs for non-credit students.

Update 12/13/2013

Georgia Tech designs its Udacity pilot to avoid failure, distancing itself from Udacity's debacle at San Jose State University.

Thursday, July 11, 2013

Four ways we can experiment with MOOCs -- Blackboard joins the fray

Provosts of 13 universities recently announced that they would be working together to take advantage of "new technologies and course redesign" to "improve instructional quality, enhance student learning outcomes, and extend the reach of campus instructional offerings."

It sounds like they want to keep open the option of remaining independent of the currently-dominant, well-funded, expensive MOOC platforms Udacity, Coursera and edX. How might a university do that? There are at least four alternative platforms, two of which are offered as hosted services:
  • Blackboard just announced that they will be hosting a new MOOC platform, which would be available free to existing Blackboard customers.
  • Blackboard competitor Canvas has a hosted MOOC platform that allows teachers to build modular courses with video lectures, quizzes, analytics, groups (inside the system or using external resources like Google Docs or Skype), etc.
There are also two open source platforms, which could be hosted by a university or other organization:
  • Google's Course Builder, the open source platform Google used for its own experimental online courses.
  • The edX platform, which edX recently released as open source.
I do not know of any sites hosting Course Builder or EdX, but would not be surprised to see some in the future. For example, I can imagine (wish for) Google integrating Course Builder with some of their other services -- Docs, YouTube, Plus with hangouts on air, and Groups -- and offering a significant platform for developing and delivering courses.

We need these and other do-it-yourself alternatives to the major MOOC platforms -- the industry will eventually consolidate, but it is too soon to do so now. In the meantime, let a thousand flowers bloom.



In this interview, Blackboard CEO Jay Bhatt says they will increase spending on software development and sees MOOCs as one point on a continuum, with support for on-campus degree programs at the other end. He welcomes competition from Google and others, as it will push the entire industry to improve.

Monday, July 08, 2013

OECD study: investment in tertiary education pays for individuals and the public

The Organisation for Economic Cooperation and Development (OECD) has published Education at a Glance, 2013, which provides data on the structure, finances, and performance of education systems in more than 40 countries, including OECD members and G20 partners.  The title says "at a glance," but the report contains many education-related indicators along with analysis.  The OECD has also prepared detailed country profiles, and video, slide and Prezi presentations.

There is way too much to summarize here, so let me show you just one example -- the private and public returns on tertiary education for men.  (They also report on women).

The private returns on investment in tertiary education are substantial.  Not only does education pay off for individuals, but the public also benefits in the form of greater tax revenues and social contributions.  The following table shows the public and private gains associated with a man attaining tertiary education (2009) as compared with upper secondary or post-secondary non-tertiary education.

The net public return on investment for a man with tertiary education is over US$100,000 across OECD countries – almost three times the amount of public investment (direct cost plus foregone tax on earnings). For a woman, the public return is around US$60,000, almost twice the public investment.  Note that the public return for the United States is second only to that of Hungary.

The report includes the data in tables and downloadable spreadsheets and spells out the methodology and definition of indicators in detail.  Here, for example, are the indicators leading to the table shown above:

Most of this data is from 2009, so critics might counter that the cost of education has risen substantially since that time and the payoffs have dropped, and they would be correct. On the other hand, we are experiencing a spurt of innovation in online education.

In the nations surveyed, education paid off handsomely for individuals and the public. My guess is that, as we improve online education and begin to define its role in certification and employment, the payoff from education will grow.

If this report sounds interesting, you can download a copy here. Note that it includes links to Excel spreadsheets with all of the raw data, so you can draw your own conclusions.

Wednesday, July 03, 2013

Doug Engelbart passes away

If you use a mouse, hyperlinks, video conferencing, WYSIWYG word processor, multi-window user interface, shared documents, shared database, documents with images & text, keyword search, instant messaging, synchronous collaboration, asynchronous collaboration -- thank Doug Engelbart.

Then go watch The Demo, where Doug unveiled them to the world:

I have a teaching module with a presentation that puts Doug's enormous contribution in context.

I had the privilege of visiting Doug's lab when I was a graduate student working on interactive data mining. Two researchers, G. H. Ball and D. J. Hall were already using Doug's equipment for data analysis. He was open, idealistic, optimistic and far ahead of his time.

Here is a picture of Doug delivering The Demo:

(I added the chord keyset to let you all know that Doug was not infallible :-).

Here is the abstract for The Demo at the 1968 Fall Joint Computer Conference (JCC). The Fall and Spring JCCs were major conferences at the time, and Doug spoke before a crowded auditorium. It was the most important presentation in the history of computer science.

One reason Doug and his team were so productive and successful is that they used the excellent collaboration-support tools they were building to support their own work. Doug emphasized the importance of their use of their own tools -- as they say "eating their own dogfood." The team used shared documents and databases as well as synchronous collaboration tools like what must have been the first computer-equipped meeting room:

People insist on summarizing Doug's contribution as being the "inventor of the mouse." That is an extreme understatement of the importance of his work and thinking, but I guess I have to say something about the mouse. Here is a figure from his patent application:

Today we all use tools derived from his lab prototypes.  Could an economist estimate the contribution to GDP of those tools?

The following are two famous images Doug used to illustrate his work. Unlike Steve Jobs, Doug believed that if a tool were sufficiently powerful, people would invest the time to learn to use it, so simplicity and similarity to familiar tools from the past were not necessary. We would not build a racing tricycle for high-speed travel, but would invent something new, the modern racing bike:

He saw the computer as a companion tool that would augment human intelligence. To make that point, he taped a pencil to a brick to show the de-augmentation:

The Demo and these images were crystal clear -- you "got it" as soon as you saw them. On the other hand, Doug's writing could be a bit hard to follow -- he worked within a theoretical framework which he expanded upon throughout his career. Check this defining document on his Conceptual Framework for Augmenting Human Intellect.


Update 8/1/2013

I've put together a 17-image photo tribute to Doug Engelbart's work -- cool images with captions and links to more.


Update 8/3/2013

Just as The Demo showed the ways in which computers could improve productivity, a demonstration of applications of the ARPANet illustrated the potential of computer networking at the 1972 International Conference on Computer Communication. Attendees could interact with networked applications using 40 terminals. The demonstrations spread interest in network applications beyond the small group that had designed and built the ARPANet.

This Ars Technica article describes the conference and demonstrations. Much of the article is excerpted from the book Where Wizards Stay Up Late by Katie Hafner and Matthew Lyon. If you like the article, you'll want to read the book.