Saturday, July 27, 2013

The MOOC That Roared

How Georgia Tech’s new, super-cheap online master’s degree could radically change American higher education.
Slate, July 23, 2013

Georgia Institute of Technology is about to take a step that could set off a broad disruption in higher education: It’s offering a new master’s degree in computer science, delivered through a series of massive open online courses, or MOOCs, for $6,600.

The school’s traditional on-campus computer science master’s degree costs about $45,000 in tuition alone for out-of-state students (the majority) and $21,000 for Georgia residents. But in a few years, Georgia Tech believes that thousands of students from all over the world will enroll in the new program.

The $6,600 master’s degree marks an attempt to realize the tantalizing promise of the MOOC movement: a great education, scaled up to the point where it can be delivered for a rock-bottom price. Until now, the nation’s top universities have adopted a polite but distant approach toward MOOCs. The likes of Yale, Harvard, and Stanford have put many of their classes online for anyone to take, and for free. But there is no degree to be had, even for those who ace the courses. Education writer and consultant Tony Bates recently noted that until top institutions begin putting a diploma behind their MOOCs, “we have to believe that they think that this is a second class form of education suitable only for the unwashed masses.”

While many universities now offer online degree programs (which don’t enroll students at anything like MOOC scale), they are almost always priced at the level of their traditional on-campus programs. George Washington University’s online healthcare MBA, for example, costs the same $1,485 per unit (52.5 units, or roughly $78,000 in all, gets you to the finish line) as the standard program. The reasons for this are many, but perhaps the most important is that universities are terrified of debasing the value of their diplomas.

Drop the price of the online degree, the logic goes, and you could have a Napster-like moment sweeping college campuses. Revenues spiral down as degree programs are forced to compete on tuition. That’s a terrifying prospect for universities, which have depended on steadily rising tuition—growing at more than twice the rate of inflation—to cover costs.

Georgia Tech’s new program, though, throws a monkey wrench into the system by reordering the competitive landscape. U.S. News & World Report ranks the computer science department among the nation’s top 10. The new degree—which is a partnership with MOOC pioneer Udacity—is intended to carry the same weight and prestige as the one it awards students in its regular on-campus program.

John Backus, the chief executive of Atlantic Ventures, which invests in a number of higher-education companies, asks: “Why would you go to XYZ college, pay three to four times the amount, when you can get a master’s degree more cheaply and from a better school?”

Across town from Georgia Tech, Emory University, with a less prestigious computer science department, offers a master’s degree for around $40,000 in tuition. Plenty of other mid-tier schools also could find themselves struggling to justify their high sticker price. A year of tuition at Boston University’s graduate school costs in excess of $43,000; Washington University in St. Louis is $44,000. Living expenses and other fees can drive those figures up even more. At most programs, Georgia Tech’s included, a few master’s students get teaching or research fellowships, but financial aid opportunities are usually hard to come by. Many master’s students finance their tuition by taking on debt.

There are still a lot of questions about whether an online program with thousands of students can measure up to a campus-based program that enrolls about 130. Chip Paucek is the founder of 2U, a firm that provides online course platforms for university programs such as Georgetown University’s master’s in nursing. He insists that even with an online program, you can’t just add students without compromising quality. “Nearly 1,000 students receive live training each week in classes of 10–15 students,” he says, adding, “Could we reduce the cost of instruction by doubling or tripling the size of the live classes? Sure. But we are confident that doing so would undercut the learning experience.”

Even Zvi Galil, the head of Georgia Tech’s school of computing who is launching the new program, is wary. “This is uncharted territory,” he says. But, he warns, if Georgia Tech doesn’t do this someone else might come along and do it first—grabbing the notoriety, the students, and the revenue. “There is a revolution. I want to lead it, not follow it,” he says.

The idea for the program first came about last fall when Udacity founder Sebastian Thrun met up with Galil, and, with typical Silicon Valley swagger, proposed creating an online master’s program for $1,000. The pair got to work on it. Both Thrun and Galil recite the well-worn MOOC talking points. “How can we reach people we can’t reach today?” asks Thrun. “Online is one way. Price is another.” They also say that offering a computer science degree on a massive scale can help alleviate the demand for talent in the job market.

Currently, Georgia Tech receives about 1,400 applications for its computer science master’s program and accepts less than 15 percent. Galil insists that more than half of the applicants, however, could actually handle the work. He says there is no reason why the online program, by its third year, shouldn’t be able to admit all the applicants who meet the minimum standards. Many applicants are from overseas. With the new MOOC-style program, hurdles such as obtaining student visas melt away.

As currently configured, MOOCs have plenty of failings. According to one recent study, the completion rate is generally under 10 percent. Cheating can be rampant. Neither of those would do for a program that costs money and delivers a prestigious degree. So Thrun and Galil began building what they call “MOOC 2.0.” That means baking a lot of human oversight and interaction into the system. It includes student advising, some tutoring, and even the hectoring to turn in homework on time that happens in a traditional course.

However, another consequence of Georgia Tech’s exercise is to lift the kimono on the crude math that universities use to figure out what to charge. (The site Inside Higher Ed obtained documents that outline the cost breakdown and did an analysis.) A $2 million initial gift from AT&T—which plans to enroll some of its employees—helped defray the start-up costs. That means that even with the bargain-basement tuition, both Georgia Tech and Udacity expect to split a modest surplus of $240,000 from year one.

As the program scales up, the costs increase, but the anticipated revenues grow faster. By the third year enrollment is estimated to hit more than 2,000, more master’s students than the department has graduated in the past 20 years. That year, costs are projected to be $14.3 million, revenues $19.1 million.

Georgia Tech’s current on-campus program also brings in revenue that exceeds its costs, though the university says it is difficult to put a hard figure on how much. In recent years, universities across the country have rolled out new master’s programs—particularly in subjects that are required for state licenses, such as teaching or psychology. Often, these turn out to be cash cows because of the high tuition and low levels of university financial aid. But Georgia Tech’s MOOC master’s, with its ability to scale up dramatically, could potentially dwarf the returns the university has seen previously.

“It’s a complicated formula,” says Thrun. “The bulk of the costs are the face-time. The platform itself is the cheapest component.”

Some of those costs have big implications for the campus pecking order: The plan calls for hiring course assistants who aren’t drawn from the ranks of graduate students. Instead, they will be people with some expertise in the subject matter. Galil said that some could even be recent graduates of the online program; others might not have master’s degrees.

As Georgia Tech moves ahead, one of the first casualties may be the university’s own on-campus computer science master’s program. Charging an out-of-state student $45,000 to show up to class on time might be a tough sell when the same degree can be had for a fraction of the cost.

Galil guesses that some students might be willing to pay extra for the in-person contact or the chance to connect with corporate recruiters visiting campus. But he assumes that many others won’t. The on-campus program could shrink, he says, but adds, “So what?”

Monday, July 22, 2013

Microsoft Research Summit Impressions

This week I attended the Microsoft Research Faculty Summit, an annual event held at MSR Redmond. The 2013 event gathered over 400 faculty from around the world. I was honored to receive an invitation, as invitations are competitive: MSR researchers nominate faculty, and a committee at MSR selects a subset to invite.

Below are some of my impressions from the event. But before I go on, I first want to thank MSR researchers John Krumm, Ivan Tashev and Shamsi Iqbal for spending time with me at the summit. Thanks also to MSR’s Tim Paek, who has played a key role in a number of our studies at UNH.

Bill Gates inspires
Bill Gates was the opening keynote speaker. He discussed his work with the Gates Foundation and answered audience questions. One interesting moment in the Q&A session was Bill’s proposed analogy between MOOCs and recorded music: in the past there was much more live music, while today we primarily listen to recorded music. In the future, live lectures might likewise become much less common, and we might instead primarily listen to recorded lectures by the best lecturers. While this might sound scary to faculty, Bill points out that lectures are just one part of a faculty member’s education-related efforts. Others include work in labs, study sessions, and discussions.

MSR is a uniquely open industry lab
While MSR is only about 1% of Microsoft, it spends as much on computing research as the NSF. And most importantly, as Peter Lee, Corporate VP of MSR, pointed out, MSR researchers publish, and in general conduct their work in an open fashion. MSR also sets its own research course independently, even of Microsoft proper.

Microsoft supports women in computing
The Faculty Summit featured a session on best practices in promoting computing disciplines to women. One suggestion that stuck with me is that organizations (e.g. academic departments) should track their efforts and outcomes. Once you start tracking and creating a paper trail, things will start to change.

Moore’s law is almost dead (and will be by 2025)
Doug Burger, Director of Client and Cloud Applications in Microsoft Research’s Extreme Computing Group, pointed out that we cannot keep increasing computational power by shrinking transistors, as transistors are approaching atomic dimensions. There’s a need for new approaches. One possible direction is to customize hardware: e.g. if we only need 20 bits for a particular operation, why implement the logic with 32?

The Lab of Things is a great tool for ubicomp research
Are you planning a field experiment in which you expect to collect data from electronic devices in the home? Check out the Lab of Things (LoT); it’s really promising. It allows you to quickly deploy your system, monitor system activity from the cloud, and log data in the cloud. There’s a video online introducing the LoT.

Source: Andrew Kun, University of New Hampshire


Bill Gates @ Microsoft Research Faculty Summit

Bill Gates says that this is the “golden era” of learning, thanks to massive open online courses and easy access to information. The chairman of Microsoft gave the keynote address on Monday at Microsoft Research’s Faculty Summit, an annual event that brings together Microsoft researchers and academics from more than 200 institutions for a two-day conference in Redmond, Wash., on current issues facing computer science.

At the summit, Mr. Gates told the audience that he sees enormous potential for MOOCs but cautioned that online education still faces many challenges. He also talked briefly about online education during a question-and-answer session with the audience.

In the Q&A, Mr. Gates predicted that MOOCs would not be “place-based” classes but would be led by a small subset of instructors teaching to a broad audience. There’s a set of people, he said, who are really good at it and who have big budgets and great support.

Mr. Gates said online education is “certainly going to happen” but emphasized that there’s more to learning than simply lectures. He cautioned that it was important not to forget the value of in-class learning, especially in lab work and in collaborative study groups.

He added that the high dropout rate of students enrolled in MOOCs has yet to be dealt with. In some instances, he said, the for-profit sector of education is doing a better job of tackling the issue through its intense support of students. As an example, he cited how for-profit learning companies will contact students if they are not present for a class or Webcast. He called that kind of hand-holding a “best practice.”

Mr. Gates also addressed the current discussion over the value of a college degree, predicting a “decoupling” of the degree from knowledge acquisition. Traditionally, a college degree was a badge indicating skills in certain areas that could be translated to employment. Mr. Gates said that may no longer be the case, largely because of online education.

This will be “a global phenomenon,” he said. “We’re at the beginning of something very profound.”

Source: Ms. Grossman, Chronicle of Higher Education

Saturday, June 29, 2013

Summer Programming Camps for Kids

The video game Jacob Asofsky is creating is simple: "Someone who is trying to take over the world and you try to stop them."

The 12-year-old from Florida is spending two weeks at a summer camp in a program that teaches programming skills to young people.

"It's about having fun, but it also gives them the tools to be able to do this at home because they don't have this in school," said Taylor Jones, director of the iD Tech Camp at Atlanta's Emory University.

So-called coding camps for children are becoming more popular amid a growing effort to expand access to computer programming and inspire more youths to seek computer science degrees and careers in technology. Their rise underscores a seeming mismatch in the U.S. economy: people like Facebook CEO Mark Zuckerberg and Tumblr founder David Karp illustrate the opportunities programming skills can create, yet universities are not graduating enough code-savvy students to meet employers' demands.

The iD Tech Camps, which have grown from 200 students in 1999 to 28,000 enrolled this year in courses at dozens of locations nationwide, use interest in gaming to build bridges to computer programming and, hopefully, careers in Web development, film animation and app creation for smartphones. Day courses start at $829 for one week, with overnight students paying $1,348.

On a recent weekday, Asofsky was attending an iD Tech Camp on the campus with some 95 other youths under the age of 17. He was using the gaming software RPG Maker to create a video game in which the main character travels around the world, buys animals and armor and interacts with others along the way.

"I have to say the interface of actually making a game is just as fun as playing a game," Asofsky said. "It's a lot like playing a game inside a game."

Early courses for children starting at age 7 use the photo and illustration software Adobe Photoshop and the gaming software Multimedia Fusion to create a simple arcade-style game.

"We sit down and talk about what makes games fun," said instructor Melissa Andrews, who was working with the youngest group of campers. "We get it down to the basics so they can make their own game."

Courses for older children include designing apps, creating sophisticated 3-D first-person shooter games using the Unreal Development Kit - also known as UDK - and learning programming languages like Java and C++. The idea is to build self-confidence and spark interest in learning how computers work, all to perhaps plant the seed of a future career in programming.

There will be 1.4 million computing jobs by 2020 but only 400,000 computer science students by that time, according to Code.org, a nonprofit whose advisory board reads like a who's who of the tech world, including Twitter creator Jack Dorsey and Dropbox CEO Drew Houston.

And the jobs pay well. The median annual wage for a computer programmer, for instance, was $71,380 in 2010, according to the Bureau of Labor Statistics. Meanwhile, jobs for network and computer systems administrators are growing at double the national average, with a median annual salary of $69,160.

Yet high schools and universities seem to be out of step with the job market. Nine out of 10 high schools don't offer computer programming classes and the number of students graduating from college with a computer science degree is down from a decade ago, according to Code.org.

Earlier this year, President Barack Obama said programming should be a required course in high school, similar to foreign languages.

"Given how pervasive computers and the Internet is now and how integral it is into our economy and how fascinated kids are with it, I want to make sure they know how to actually produce stuff using computers and not just simply consume stuff," Obama said during a Google+ Hangout.

Yale Oseroff's high school back in Virginia doesn't offer programming classes. The 17-year-old is spending his fourth year at an iD Tech Camp working through C++, a popular programming language used for systems and application software, for drivers that let an operating system communicate with devices like printers, and for creating some video games.

"I'm learning (computer) networking, which is what I want to do in college," he said, as he worked on developing a program to capture usernames and passwords and store them in a database.

On the Georgia Tech campus, the Institute for Computing Education offers a variety of camps clustered into elementary, middle and high school groups. Courses include making apps with App Inventor, creating moving sculptures with the WeDo Robotics system, which uses rotational motion, and creating animations using Alice software.

Barbara Ericson, director of computer outreach at the Institute for Computing Education at Georgia Tech, said people sometimes ask: why not wait until children are older to start teaching them how to program?

"Anything over the age of 7 is capable, they are capable of learning reasoning," she said.

During a presentation earlier this month at a technology conference in Washington, D.C., Code.org founder Hadi Partovi said less than 5 percent of U.S. high school students spend class time learning computer science, while it's a graduation requirement in China. He noted that many "software" jobs are outside the tech industry, in fields such as banking, retail, government and entertainment, which makes programming skills particularly versatile.

"It could mean starting your own company," he said. "But it could mean you're a doctor and you're tired of entering the same data into a chart using paper and you want to write an app that does it for you."

Source: Christina Cassidy, Associated Press

Tuesday, April 23, 2013

A Revolution in Basic Education

Sal Khan has a simple mission: a free, world-class education for anyone, anywhere. Naturally, people think he's crazy. The craziest part is not the "world-class education" part, because plenty of people want that. And it's not even the "for anyone, anywhere" part. It's the "free" part.

Crazy or not, it's an idea that has attracted attention from Downing Street to Washington DC. And like a lot of crazy ideas, it started by accident.

Khan – working as a financial analyst in 2004 after earning degrees from MIT and an MBA from Harvard – started remotely tutoring his cousin, Nadia, in Louisiana, who was struggling with maths. "Then the rest of the family heard there was free tutoring," he says, and more relatives started taking part. The demands got too much – until a friend suggested he could film the tutorials, post them on YouTube and let the family members view them whenever they chose.

"YouTube? YouTube was for cats playing the piano, not serious mathematics," Khan recalls thinking. "I got over the idea that it wasn't my idea and decided to give it a shot."

Since 2009, Khan has devoted himself full-time to his Khan Academy, a tutoring, mentoring and testing educational website at khanacademy.org that offers its content free to anyone with internet access willing to work through its exercises and pithy videos, the majority narrated by Khan himself.

"It's no exaggeration to say that there's a revolution coming in education, sparked by Sal Khan," says Rohan Silva, a Downing Street senior adviser on technology.

Using the internet to widen access to education is not itself revolutionary. The success of iTunes U applications from Apple and the rise of the massive open online courses – nicknamed Moocs – at institutions such as Stanford University show the appetite is there.

But the Khan Academy is different. Although it also carries tutorials in arts, computing and science, its core remains secondary school maths, in which it couples hand-holding video instruction with online exercises, from basic addition and multiplication to the farther reaches of algebra and calculus. There are no accredited qualifications, just a self-paced course combined with sophisticated software that charts progress and highlights weaknesses, making it simple for a parent to help a child with homework without knowing the finer points of algebra.

[Image: a tutorial from the Khan Academy website, "Why Distance is Area under Velocity-Time Line"]

There's an easy way to see what the fuss is about: the website allows anyone to sign up and start viewing tutorials and taking the interactive tests that are at the heart of the academy's method.

The concept is simple: watch a video in which Khan explains the subject being learned, and then take the online tests that follow. The software times answers, notes missteps, and offers encouragement for doing well or even just persevering. Then, when satisfied you have mastered the topic, it invites you to move on to a related topic.
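To make that loop concrete, here is a minimal sketch, in Java, of mastery-style progression logic. Everything in it is invented for illustration (the class name, the five-in-a-row threshold, the feedback messages), and the academy's real software is far more sophisticated, but it shows the shape of the idea: time answers, note missteps, and only invite the learner onward once mastery looks solid.

```java
// Minimal sketch of mastery-style progression; all names and thresholds invented.
public class TopicTracker {
    private static final int STREAK_TO_MASTER = 5; // hypothetical rule: 5 correct in a row
    private int streak = 0;

    // Record one answer and how long it took; return feedback for the learner.
    public String recordAnswer(boolean correct, double seconds) {
        streak = correct ? streak + 1 : 0; // a misstep resets progress toward mastery
        if (!correct) {
            return "Not quite. Rewatch the video and try a similar problem.";
        }
        if (streak >= STREAK_TO_MASTER) {
            return "Mastered! Time to move on to a related topic.";
        }
        return seconds > 60
                ? "Correct, and nice perseverance. Keep practicing for speed."
                : "Correct! " + (STREAK_TO_MASTER - streak) + " more in a row to master this.";
    }

    public static void main(String[] args) {
        TopicTracker tracker = new TopicTracker();
        System.out.println(tracker.recordAnswer(true, 42.0));  // quick and correct
        System.out.println(tracker.recordAnswer(false, 90.0)); // misstep resets the streak
    }
}
```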

In a classroom setting, students can move at their own pace, and make repeated viewings of tutorials if they don't understand first time around. The teacher can track progress on their own laptop, and intervene to give an explanation when they see a student struggling.

"I started this out as a hobby," Khan, 36, told a packed theatre at the London School of Economics last week, after the inevitable question about his academy's not-for-profit status. "Although my friends in Silicon Valley were quick to ask me: 'What's your business model?' I said: 'I don't have a business model.'"

A slight figure dressed in American smart-casual that belies teenage years playing heavy metal, the 36-year-old owes a lot to the transformative power of education. The child of Bangladeshi and Indian immigrants, he was born in America's deep south. "Louisiana was as close to south Asia as the United States could get: it had spicy food, humidity, giant cockroaches and a corrupt government," Khan writes in his book, The One World Schoolhouse.

Business model or not, what the Khan Academy does have is a lot of fans, and about 6 million regular users a month, not to mention tens of millions of viewings of its 4,000 online tutorials. When Khan asks the audience at the LSE how many use his site, perhaps half raise their hands, some sheepishly.
From the beginning Khan made a deliberate decision to stay off camera, and in the early days used relatively crude drawing software on a black background to mimic a blackboard. Khan's cordial tone and mellifluous voice combine to make the tutorials approachable and, thanks to a 10-minute time limit initially imposed by YouTube, concise.

Word spread that there were tutorials in basic maths freely available. Khan began to notice appreciative comments appearing under his videos. Then letters started arriving. One that grabbed his attention was from a woman with two disabled children who had used the videos. "She said that her entire family prays for my family every night," Khan relates. "To put that into context, at the time I was working for a hedge fund."

More importantly, the US educational establishment was taking note, starting with the Los Altos school district in California, which began a pilot programme using the Khan Academy's tools. The initial results were spectacular, in Khan's telling, and since then the movement has spread. With backing from supporters the academy has been training teachers to use its tools and methods in places such as Idaho and Kansas.

Has he had any discussions about the Khan Academy partnering with schools in the UK? "Nothing formal now but we would like to," Khan replies.

By 2009 Khan decided to leave the hedge fund for what he describes as a closet in his house in Mountain View, California – although American ideas of what constitutes a closet are more generous than in Europe. Paying the bills was a problem until a local philanthropist, Ann Doerr, made the first significant donation, first of $10,000 (£7,000) and then $100,000, after she realised Khan was in effect working for nothing.

It was Doerr who tipped Khan off about a turning point for the academy. At the Aspen Ideas Festival – a US version of Davos – in 2010, Bill Gates told an audience that he had been using Khan's videos to teach his own child. "It actually made me a little nervous," says Khan. "It was a video made for Nadia, not Bill Gates."

Soon a call came from Seattle for a meeting with Gates himself, and the Bill and Melinda Gates Foundation became one of the academy's biggest supporters. "I've used Khan Academy with my kids, and I'm amazed at the breadth of Sal's subject expertise and his ability to make complicated topics understandable," Gates wrote in tribute to Khan being named one of Time's 100 most influential people of 2012. Gates concluded: "He started by posting a math lesson, but his impact on education might truly be incalculable."

Google and others have since got on board, as the honours mounted up for Khan. Earlier this year the Mexican billionaire Carlos Slim – said to be the world's richest man – announced his foundation was funding the translation of the academy's work into Spanish, and the academy says its material is used in classrooms around the world, including Mongolia.

There are critics who have challenged the Khan Academy's content and Khan's approach. Some have posted detailed criticisms of his tutorials, although Khan says he is happy for the scrutiny. "It'd piss me off too if I had been teaching for 30 years and suddenly this ex-hedge-fund guy is hailed as the world's teacher," he told Time last year.

Khan is no fan of traditional education, which he derides as "lecture, homework, lecture, homework". "The real problem is that the process is broken," he tells his LSE audience, to nods of approval. "We identify the gaps [in children's knowledge], then we ignore them."

(From Richard Adams of the Guardian newspaper in London)

Thursday, April 18, 2013

Data Scientist: Best Job Opportunity in the Future

HARVARD BUSINESS REVIEW calls data science “the sexiest job of the 21st century,” and by most accounts this hot new field promises to revolutionize industries from business to government, health care to academia.

The field has been spawned by the enormous amounts of data that modern technologies create — be it the online behavior of Facebook users, tissue samples of cancer patients, purchasing habits of grocery shoppers or crime statistics of cities. Data scientists are the magicians of the Big Data era. They crunch the data, use mathematical models to analyze it and create narratives or visualizations to explain it, then suggest how to use the information to make decisions.

In the last few years, dozens of programs under a variety of names have sprung up in response to the excitement about Big Data, not to mention the six-figure salaries for some recent graduates.

In the fall, Columbia will offer new master’s and certificate programs heavy on data. The University of San Francisco will soon graduate its charter class of students with a master’s in analytics. Other institutions teaching data science include New York University, Stanford, Northwestern, George Mason, Syracuse, University of California at Irvine and Indiana University.

Rachel Schutt, a senior research scientist at Johnson Research Labs, taught “Introduction to Data Science” last semester at Columbia (its first course with “data science” in the title). She described the data scientist this way: “a hybrid computer scientist software engineer statistician.” And added: “The best tend to be really curious people, thinkers who ask good questions and are O.K. dealing with unstructured situations and trying to find structure in them.”

Eurry Kim, a 30-year-old “wannabe data scientist,” is studying at Columbia for a master’s in quantitative methods in the social sciences and plans to use her degree for government service. She discovered the possibilities while working as a corporate tax analyst at the Internal Revenue Service. She might, for example, analyze tax return data to develop algorithms that flag fraudulent filings, or cull national security databases to spot suspicious activity.

Some of her classmates are hoping to apply their skills to e-commerce, where data about users’ browsing history is gold.

“This is a generation of kids that grew up with data science around them — Netflix telling them what movies they should watch, Amazon telling them what books they should read — so this is an academic interest with real-world applications,” said Chris Wiggins, a professor of applied mathematics at Columbia who is involved in its new Institute for Data Sciences and Engineering. “And,” he added, “they know it will make them employable.”

Universities can hardly turn out data scientists fast enough. To meet demand from employers, the United States will need to increase the number of graduates with skills handling large amounts of data by as much as 60 percent, according to a report by McKinsey Global Institute. There will be almost half a million jobs in five years, and a shortage of up to 190,000 qualified data scientists, plus a need for 1.5 million executives and support staff who have an understanding of data.

North Carolina State University introduced a master’s in analytics in 2007. All 84 of last year’s graduates in the field had job offers, according to Michael Rappa, who conceived and directs the university’s Institute for Advanced Analytics. The average salary was $89,100, and more than $100,000 for those with prior work experience.

“This has become relevant to every company,” said Michael Chui, a principal at McKinsey who has studied the field. “There’s a war for this type of talent.”

Because data science is so new, universities are scrambling to define it and develop curriculums. As an academic field, it cuts across disciplines, with courses in statistics, analytics, computer science and math, coupled with the specialty a student wants to analyze, from patterns in marine life to historical texts.

With the sheer volume, variety and speed of data today, as well as developing technologies, programs are more than a repackaging of existing courses. “Data science is emerging as an academic discipline, defined not by a mere amalgamation of interdisciplinary fields but as a body of knowledge, a set of professional practices, a professional organization and a set of ethical responsibilities,” said Christopher Starr, chairman of the computer science department at the College of Charleston, one of a few institutions offering data science at the undergraduate level.

Most master’s degree programs in data science require basic programming skills. They start with what Ms. Schutt describes as the “boring” part — scraping and cleaning raw data and “getting it into a nice table where you can actually analyze it.” Many use data sets provided by businesses or government, and pass back their results. Some host competitions to see which student can come up with the best solution to a company’s problem.
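As a toy illustration of that first step, here is a short Java sketch that pulls rows of a messy CSV file into a clean in-memory table. The file name and its two columns are invented for the example, and real pipelines use far richer tooling, but the chores are the same: drop malformed rows, normalize values, and skip what won't parse.

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

// Toy data-cleaning pass: raw CSV in, tidy list of typed rows out.
// "raw_purchases.csv" and its two columns (city, amount) are invented.
public class CleanTable {
    static class Row {
        final String city;
        final double amount;
        Row(String city, double amount) { this.city = city; this.amount = amount; }
    }

    public static void main(String[] args) throws Exception {
        List<Row> table = new ArrayList<>();
        for (String line : Files.readAllLines(Paths.get("raw_purchases.csv"))) {
            String[] cells = line.split(",");
            if (cells.length != 2) continue;              // drop malformed rows
            String city = cells[0].trim().toLowerCase();  // normalize text values
            if (city.isEmpty()) continue;                 // drop missing values
            try {
                table.add(new Row(city, Double.parseDouble(cells[1].trim())));
            } catch (NumberFormatException e) {
                // skip rows whose numeric field will not parse
            }
        }
        System.out.println(table.size() + " clean rows ready for analysis");
    }
}
```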

University of San Francisco students have used data from General Electric to predict how much energy windmills could create. At North Carolina State, with data from the Postal Service, students have analyzed response rates to junk mail to find ways to improve its effectiveness.

Studying a Web user’s data has privacy implications. Using data to decide someone’s eligibility for a line of credit or health insurance, or even recommending who they friend on Facebook, can affect their lives. “We’re building these models that have impact on human life,” Ms. Schutt said. “How can we do that carefully?” Ethics classes address these questions.

Finally, students have to learn to communicate their findings, visually and orally, and they need business know-how, perhaps to develop new products.

“That’s one of the challenges,” said Terence Parr, program director of the analytics and computer science programs at the University of San Francisco. “To be successful, you need to have a wide range of skills that doesn’t fit in one department.”

The question, said Bill Howe, who teaches data science at the University of Washington, is whether it is even possible to instill in a single person all the skills needed, from statistics to predictive modeling to business strategy. The university’s offerings range from a free online course on Coursera to a nine-month certificate program to a Ph.D. track in Big Data.

“It remains to be seen,” he said, “but we’re still of the mind that a curriculum that aims to train data scientists is feasible.” He added: “What employers want is someone who can do it all.”

(From Claire Cain Miller, New York Times)

Thursday, April 11, 2013

Learning Java thru a Video Game

Computer scientists at the University of California, San Diego, have developed an immersive first-person video game designed to teach students in elementary to high school how to program in Java, one of the most common programming languages in use today.

The researchers tested the game on a group of 40 girls, ages 10 to 12, who had never been exposed to programming before. They detailed their findings in a paper they presented at the SIGCSE conference in March in Denver. Computer scientists found that within just one hour of play, the girls had mastered some of Java’s basic components and were able to use the language to create new ways of playing with the game.

“CodeSpells is the only video game that completely immerses programming into the game play,” said William Griswold, a computer scientist at the Jacobs School of Engineering at UC San Diego.

The UC San Diego computer scientists plan to release the game for free and make it available to any educational institution that requests it. Researchers are currently conducting further case studies in San Diego elementary schools.

Teaching computer science below the college level is difficult, mainly because it is hard to find qualified instructors for students in elementary to high school, Griswold said. So he and his graduate students set out to find a way to reach these students outside the classroom. They designed the game to keep children engaged while they are coping with the difficulties of programming, which could otherwise be frustrating and discouraging.

Teaching children how to program must be a priority in a society where technology is becoming more and more important, said Sarah Esper, one of the lead graduate students on the development of CodeSpells. Programming also teaches logical thinking, said Stephen Foster, another lead student.

“We’re hoping that they will get as addicted to learning programming as they get addicted to video games,” Foster said.

How CodeSpells works

CodeSpells’ story line is simple: the player is a wizard arriving in a land populated by gnomes. The gnomes used to have magic, but lost it at some point. The wizard must help them. She (or he) writes spells in Java. Players have seven spells available to them, including levitating objects within the game, flying and making fire.

Players can also earn badges by undertaking simple quests, which help them master the game’s spells. One quest entails crossing a river. Another entails rescuing a gnome from the roof of his cottage, where he got stuck. Yet another entails starting a large bonfire. By the time players complete the game’s first level, they have learned the main components of the Java programming language, such as parameters, if statements, for loops and while loops, among other skills.
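For readers unfamiliar with those components, here is plain Java, not CodeSpells' actual spell-writing interface: a hypothetical levitation "spell" that exercises a parameter, an if statement, a for loop and a while loop.

```java
// Plain Java for illustration only; not the real CodeSpells API.
// A hypothetical levitation "spell" exercising the constructs players learn.
public class LevitateSpell {
    public static void main(String[] args) {
        levitate(3);
    }

    static void levitate(int targetHeight) {      // parameter
        if (targetHeight <= 0) {                  // if statement
            System.out.println("The object stays on the ground.");
            return;
        }
        for (int h = 1; h <= targetHeight; h++) { // for loop
            System.out.println("Object rises to height " + h);
        }
        int hoverTicks = 3;
        while (hoverTicks > 0) {                  // while loop
            System.out.println("Hovering... " + hoverTicks);
            hoverTicks--;
        }
    }
}
```
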
Testing the game

Researchers tested the game on a group of 40 girls ages 10 to 12 in San Diego. They gave the students a brief overview of the game’s mechanics, including how to write and edit code within the game’s user interface. The girls were divided into groups of two or three. Researchers encouraged them to explore the game and see what they could do. “We were purposefully vague,” they wrote, “as we hoped to encourage a largely unstructured learning environment.”

The students were disappointed when they had to stop playing because the test was over. Their interest in the game didn’t wane when they made mistakes while writing code. Instead, they used the mistakes as a stepping stone to explore the game’s possibilities. For example, one group made the mistake of levitating an object so high into the air that their wizard couldn’t reach it. So the girls made their wizard jump onto another object and levitated it high enough to reach the object they were after. The girls also reported feeling empowered. When they encountered a difficulty, they tried different spells and made changes to the code until they solved it.

Computer science learning theory

CodeSpells was influenced by research that Esper and Foster conducted on how successful programmers learn their trade. They surveyed 30 computer scientists and identified five characteristics that are key to learning programming outside a classroom setting: activities must be structured by the person who is trying to learn; learning must be creative and exploratory; programming is empowering; learners have difficulty stopping once they start; and learners spend countless hours on the activity.

Researchers summarized these findings in their SIGCSE 2013 paper, humorously titled “On the Nature of Fires and How to Spark Them When You’re Not There.”

Esper will present her CodeSpells work April 18 at Research Expo at the Jacobs School of Engineering at UC San Diego.

(From University of California - San Diego Website)

Wednesday, April 3, 2013

Free Software Developer School in Paris

French internet mogul Xavier Niel will open a new school in Paris just for software developers. Niel — who previously founded France’s first entrepreneurship school — is even putting up 20 million euros to keep tuition free. Known as 42, the school will focus on project-based learning and will allow students to set their own pace for learning, the French startup blog Rude Baguette reports. It’s expected to open in November.

In the late ’90s, Niel co-founded the first internet service provider in France, Worldnet, and in 2000, he sold the company for $50 million. By then, he had already founded Free, which is now the second-largest ISP in France. Last year, the company launched Free Mobile, which offers unlimited calls, texts and data for just $27 a month, about half the price of incumbent competitors, according to Forbes.

But Niel is very much the maverick, and controversy tends to follow him. When he was 19, he started an “erotic chat” service for Minitel, the French proto-internet that shut down just last year. In 1999, Free’s parent company, Iliad, was accused by France Télécom of pirating its reverse number look-up database — the companies settled out of court — and in 2005, Niel was arrested in relation to a prostitution scandal at some sex shops that he had invested in. He was cleared of the prostitution charges, but landed a two-year suspended sentence for failing to disclose income, Forbes reported. Free also caused a stir earlier this year when the company began blocking all web ads. The French government ended up stepping in and forcing Free to end the block, according to the Economist.

Niel is a self-taught programmer who never went to college, so it’s no surprise that 42 won’t be your ordinary school. There will be no lectures, according to the school’s FAQ. All the learning will be project-based, with an emphasis on “peer to peer” learning. And the school will teach not just programming skills, but also the habits that companies are looking for in programmers: productivity, collaboration and lifelong learning and self-investment, according to the school’s website. 42 won’t be an officially accredited school. Instead, Niel is banking on the school developing a reputation for itself.

The school will accept up to 1,000 students per year. Applicants must be between the ages of 18 and 30, but there will be no requirement for any sort of degree prior to enrollment. To apply, students will attempt to complete a series of games on the school’s website. Those who are able to complete the games will move on to a one-day “try out” at the school. According to the FAQ, the students need not already know how to program to be selected.

The school echoes some U.S. educational initiatives, such as Code.org and Dev Bootcamp. Code.org — backed by the likes of Bill Gates and Mark Zuckerberg — is attempting to expand and improve computer science education in public schools in the United States. Dev Bootcamp offers a nine-week intensive training in programming and, like 42, it focuses on project-based learning and peer-to-peer education.

Monday, March 4, 2013

New Technology for Animation

New technology for animated-film experts
Movie heroes will soon be transferred to virtual worlds more easily and realistically
CeBIT 2013, Germany, February 27, 2013

Whenever computer-animated characters roam through wild landscapes, such as Gollum in The Lord of the Rings, there are real actors at work. Film studios usually use a procedure called 'motion capture'. The actors wear skin-tight suits with markers attached to them that reflect beams of infrared light sent out and received by a special camera system. In this way, the movements of a real actor are recorded and can later be transferred to a virtual character using animation software. “However, the suits are very uncomfortable for the actors, and the markers interfere with their movements”, says Nils Hasler from the Max Planck Institute for Informatics in Saarbrücken. For this reason, the computer graphics researchers there have developed a method that eliminates the need for markers but still captures the movements quickly and realistically.

This method allows actors in their normal clothing to be filmed with ordinary cameras. The movements are then analysed with special computer software and transferred to a virtual character in the form of a skeleton. “We require only a few cameras instead of the several dozen cameras needed for the special effects in Hollywood. The movements are computed so quickly that we can transfer them directly to the animated character without time delay,” Hasler explains. The computation method, which has since been patented, has been further refined in recent months. It can now deal with scenes in which several participants are simultaneously active and body parts overlap. “The system even detects a person’s movements when they are covered up by other objects or when there are disturbances in the background. This will allow us to shoot visual effects outside of the studio in the future, for example, out in open nature,” the Saarbrücken-based researcher reckons.

The computer scientists in Christian Theobalt’s “Graphics, Vision & Video” team were able to solve yet another problem in the past few months, as Hasler explains proudly: “It was difficult for our software to reconstruct the body movements of actors wearing big coats or women entering a scene in long ballroom dresses. Our new computation method enables us to capture surfaces in such precise detail that, e.g., the draping folds of clothing can be reproduced realistically.” The new technology is also useful in areas outside the film and game industry. Athletes could use it to analyse specific, individual body movements without bothersome markers. Sports journalists would be able to comment on motion sequences, like in pole vault and discus competitions, in live television broadcasts.

“The field of medicine would also profit. It would be easier for doctors to depict and track the degree of recovery after operations on joints,” Hasler explains. The researcher from the Max Planck Institute for Informatics wants to establish a company together with Professor Christian Theobalt and his research colleague, Carsten Stoll, in order to offer their software as a commercial product. “We have already had quite a few inquiries from companies in the film and sports marketing industries,” Hasler reveals.

Technical Background

The technology used in this method is quite affordable. Anywhere from five to twelve ordinary video cameras are needed. The computer scientists use their software to produce a 3-D model of the filmed person, built around a skeleton with 58 joints. In order to capture the movements, the computation method continuously overlays the two-dimensional images from the video cameras and the 3-D model as exactly as possible. The researchers can solve the necessary equations for this task efficiently and quickly. With this method, they capture filmed movements and visualize them in the virtual characters within just a few milliseconds.
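To give a flavor of what "overlaying" the two-dimensional image and the 3-D model means computationally, here is a drastically simplified sketch: one joint, one camera, a toy pinhole projection and a naive iterative update, all invented for illustration. The real method fits all 58 joints against several camera views at once, but the principle of adjusting the 3-D pose until its projection matches what the cameras see is the same.

```java
// Drastically simplified pose-fitting sketch: adjust one 3-D joint until its
// 2-D projection matches the observed image position. The projection and
// update rule are toy versions invented for illustration.
public class PoseFit {
    public static void main(String[] args) {
        double x = 0.0, y = 0.0, z = 2.0;   // initial guess for the joint (depth held fixed)
        double obsU = 0.25, obsV = 0.10;    // where the camera actually sees the joint
        double step = 0.5;                  // how aggressively to correct each iteration

        for (int i = 0; i < 100; i++) {
            double u = x / z, v = y / z;          // pinhole projection to 2-D
            double du = obsU - u, dv = obsV - v;  // overlay error in the image plane
            if (du * du + dv * dv < 1e-10) break; // overlay is as exact as we need
            x += step * du * z;                   // nudge the joint to shrink the error
            y += step * dv * z;
        }
        System.out.printf("Fitted joint position: (%.3f, %.3f, %.3f)%n", x, y, z);
    }
}
```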

Source: Computer Science on Saarland University Campus

Saturday, March 2, 2013

The Learning Virtues

By DAVID BROOKS
New York Times, February 28, 2013

Jin Li grew up in China during the Cultural Revolution. When the madness was over, the Chinese awoke to discover that far from overleaping the West, they were “economically destitute and culturally barren.” This inspired an arduous catch-up campaign. Students were recruited to learn what the West had to offer.

Li was one of the students. In university, she abandoned Confucian values, which were then blamed for Chinese backwardness, and embraced German culture. In her book, “Cultural Foundations of Learning: East and West,” she writes that Chinese students at that time were aflame — excited by the sudden openness and the desire to catch up.

Li wound up marrying an American, moved to the States and became a teacher. She was stunned. American high school students had great facilities but didn’t seem much interested in learning. They giggled in class and goofed around.

This contrast between the Chinese superstudent and the American slacker could be described with the usual tired stereotypes. The Chinese are robots who unimaginatively memorize facts to score well on tests. The Americans are spoiled brats who love TV but don’t know how to work. But Li wasn’t satisfied with those clichés. She has spent her career, first at Harvard and now at Brown, trying to understand how Asians and Westerners think about learning.

The simplest way to summarize her findings is that Westerners tend to define learning cognitively while Asians tend to define it morally. Westerners tend to see learning as something people do in order to understand and master the external world. Asians tend to see learning as an arduous process they undertake in order to cultivate virtues inside the self.

You can look at the slogans on university crests to get a glimpse of the difference. Western mottos emphasize knowledge acquisition. Harvard’s motto is “Truth.” Yale’s is “Light and truth.” The University of Chicago’s is “Let knowledge grow from more to more; and so be human life enriched.”

Chinese universities usually take Confucian sayings that emphasize personal elevation. Tsinghua’s motto is “Strengthen self ceaselessly and cultivate virtue to nurture the world.” Nanjing’s motto is “Be sincere and hold high aspirations, learn diligently and practice earnestly.”

When Li asked Americans to talk spontaneously about learning, they used words like thinking, school, brain, discovery, understand and information. Chinese, on the other hand, tended to use phrases common in their culture: learn assiduously, study as if thirsting or hungering, be diligent in one’s learning.

In the Western understanding, students come to school with levels of innate intelligence and curiosity. Teachers try to further arouse that curiosity in specific subjects. There’s a lot of active learning — going on field trips, building things. There’s great emphasis on questioning authority, critical inquiry and sharing ideas in classroom discussion.

In the Chinese understanding, there’s less emphasis on innate curiosity or even on specific subject matter. Instead, the learning process itself is the crucial thing. The idea is to perfect the learning virtues in order to become, ultimately, a sage, which is equally a moral and intellectual state. These virtues include: sincerity (an authentic commitment to the task) as well as diligence, perseverance, concentration and respect for teachers.

In Chinese culture, the heroic scholar may possess less innate intelligence but triumphs over hardship. Li cites the story of the scholar who tied his hair to a ceiling beam so he could study through the night. Every time his head dropped from fatigue, the yank of his hair kept him awake.

Li argues that Westerners emphasize the Aha moment of sudden insight, while Chinese are more likely to emphasize the arduous accumulation of understanding. American high school students tease nerds, while there is no such concept in the Chinese vocabulary. Western schools want students to be proud of their achievements, while the Chinese emphasize that humility enables self-examination. Western students often work harder after you praise them, while Asian students sometimes work harder after you criticize them.

These cultures are surprisingly enduring, Li notes, even with all the cross-pollination that goes on in the world today. Each has its advantages. I’m mostly struck by the way the intellectual and moral impulses are fused in the Chinese culture and separated in the West.

It’s easy to see historically why this came about. Hellenic culture emphasized skeptical scientific inquiry. With us, religion and science have often been at odds. We’re a diverse society, so it’s easier to teach our common academic standards in the classroom and relegate our diverse moralities to the privacy of the home.

I’d just note that cultures that do fuse the academic and the moral, like Confucianism or Jewish Torah study, produce these awesome motivation explosions. It might be possible to champion other moral/academic codes to boost motivation in places where it is absent.

Wednesday, February 27, 2013

IT Security Skills Needed

IT Security Organizations Facing Shortage of Skilled Professionals
eWeek (02/25/13) Brian Prince

There is a severe shortage of skilled cybersecurity professionals, according to an International Information Systems Security Certification Consortium survey of more than 12,000 information security professionals from around the world. The survey found that 56 percent of respondents said their organizations are short-staffed. Among the study's notable findings, communication skills were the second most commonly cited factor for success, cited by 91 percent, right behind "a broad understanding of the security field," and leadership skills were mentioned by 68 percent of respondents. "I think there's an understanding--not only on the part of professionals in this industry but also on the part of hiring managers--that a really good information security professional not only has the technical knowledge but also has a desire to stay on top of their field and have those broad managerial skills," says the consortium's Julie Peeler. Security certifications are viewed as a reliable indicator of competency when hiring, cited by nearly 70 percent of respondents; the average annual salary of certified professionals is $101,014, which is 33 percent higher than the average annual salary of those without certification. The number of security professionals is expected to rise steadily around the globe, by more than 11 percent annually over the next five years.

Thursday, February 7, 2013

What is a PC?

Article by Doug Aamoth (Time Magazine), Feb 7, 2013

The question that makes up the title of this post is the very same question that led off a post written by my colleague Harry roughly a year ago.

Harry’s post cited a report by research firm Canalys that said tablets, which Canalys refers to as “pads,” made up 22% of worldwide PC sales during the fourth quarter of 2011. Yes, Canalys was counting tablets as PCs.

Fast forward to today, and Canalys has just released PC sales numbers for the fourth quarter of 2012. With tablets still counted as PCs, that 22% has grown to 33%. Canalys also relays that “one in six PCs shipped in Q4 2012 was an iPad.”

This brings us back to the “What is a PC?” question. Last year, Harry informally surveyed his Twitter followers and found the following general consensus:
A PC runs apps. The owner gets to define the device’s capabilities by installing software on it–and, these days, by using it with Web-based services.
It’s a general-purpose device. You can use one to write a novel, balance a checkbook, listen to a symphony, design a jumbo jet or pretend you’re Batman. True, most people don’t use a single PC for quite so dizzying an array of tasks–but they could.
It’s designed principally for use by one person at a time. That’s where the “personal” in “personal computer” comes in. It’s a computer for you, which was a pretty radical notion when the PC first got going in the mid-1970s, and still a liberating one.
It can be of any size. Which means that smartphones are PCs, since they run general-purpose software and are generally used by one person. In an era of products such as Samsung’s Galaxy Note, which is as much tablet as phone, I see no reason to declare that something isn’t a PC simply because it fits (just barely) into a pocket.
I’d say that consensus still holds up today, and I’d add that with each passing year, we’ll probably put less and less effort into trying to define what constitutes a personal computer and what doesn’t. Though some have argued that we’re in the “post-PC” era, I’d argue that the so-called era will be a short one. “Post-PC” simply acts as a temporary phrase used to differentiate between a box on your desk, something with a hinge on your lap, a slab you poke at with your finger, and a smaller slab you poke at with your finger that fits in your front pocket and can make phone calls.
But in the not-too-distant future, if you’re using a personal device that computes things, it’s not going to matter what you call it. It might not even be referred to as a PC or a tablet or a smartphone. Within the next 10 years or so, my bet’s on each person carrying a single highly portable, super powerful device that docks into inexpensive desktop, laptop, tablet, TV, car dashboard and smartphone shells, depending upon the usage scenario. One tiny computer that can shape-shift as needed: that’s the definition of the PC I want.