Archive for the ‘Computing’ Category

The 2017 Top Programming Languages

November 4th, 2017 No comments

From: IEEE Spectrum
The 2017 Top Programming Languages
Python jumps to No. 1, and Swift enters the Top Ten
By Stephen Cass
Date: July 18, 2017

It’s summertime here at IEEE Spectrum, and that means it’s time for our fourth interactive ranking of the top programming languages. As with all attempts to rank the usage of different languages, we have to rely on various proxies for popularity. In our case, this means having data journalist Nick Diakopoulos mine and combine 12 metrics from 10 carefully chosen online sources to rank 48 languages. But where we really differ from other rankings is that our interactive allows you to choose how those metrics are weighted when they are combined, letting you personalize the rankings to your needs.

We have a few preset weightings—a default setting that’s designed with the typical Spectrum reader in mind, as well as settings that emphasize emerging languages, what employers are looking for, and what’s hot in open source. You can also filter out industry sectors that don’t interest you or create a completely customized ranking and make a comparison with a previous year.
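The preset-versus-custom weighting Spectrum describes boils down to a weighted sum of normalized metrics for each language. Here is a minimal sketch of that idea in Python; the metric names and numbers are invented for illustration and are not Spectrum's actual data or methodology.

```python
# Hypothetical normalized metric values per language (0 to 1).
metrics = {
    "Python": {"search_hits": 0.95, "job_postings": 0.80, "github_activity": 0.90},
    "C":      {"search_hits": 0.85, "job_postings": 0.95, "github_activity": 0.70},
    "Java":   {"search_hits": 0.90, "job_postings": 0.85, "github_activity": 0.75},
}

def rank(metrics, weights):
    """Score each language as a weighted sum of its metrics, then sort."""
    scores = {
        lang: sum(weights[m] * v for m, v in vals.items())
        for lang, vals in metrics.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# A "default" preset weights everything equally; an "employers" preset
# would lean on job postings, which can reshuffle the order.
print(rank(metrics, {"search_hits": 1/3, "job_postings": 1/3, "github_activity": 1/3}))
print(rank(metrics, {"search_hits": 0.2, "job_postings": 0.6, "github_activity": 0.2}))
```

With these made-up numbers, the equal weighting puts Python first, while the jobs-heavy weighting puts C first, echoing the article's observation that C leads Python in the recruiting-focused view.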

So what are the Top Ten Languages for the typical Spectrum reader?

[Click here to explore interactive rankings]

Python has continued its upward trajectory from last year and jumped two places to the No. 1 slot, though the top four—Python, C, Java, and C++—all remain very close in popularity. Indeed, in Diakopoulos’s analysis of what the underlying metrics have to say about the languages currently in demand by recruiting companies, C comes out ahead of Python by a good margin.

C# has reentered the top five, taking back the place it lost to R last year. Ruby has fallen all the way down to 12th position, but in doing so it has given Apple’s Swift the chance to join Google’s Go in the Top Ten. This is impressive, as Swift debuted on the rankings just two years ago. (Outside the Top Ten, Apple’s Objective-C mirrors the ascent of Swift, dropping down to 26th place.)

However, for the second year in a row, no new languages have entered the rankings. We seem to have entered a period of consolidation in coding as programmers digest the tools created to cater to the explosion of cloud, mobile, and big data applications.

Speaking of stabilized programming tools and languages, it’s worth noting Fortran’s continued presence right in the middle of the rankings (sitting still in 28th place), along with Lisp in 35th place and Cobol hanging in at 40th: Clearly even languages that are decades old can still have sustained levels of interest. (And although it just barely clears the threshold for inclusion in our rankings, I’m pleased to see that my personal favorite veteran language—Forth—is still there in 47th place).

Looking at the preset weighting option for open source projects, where we might expect a bias toward newer projects versus decades-old legacy systems, we see that HTML has entered the Top Ten there, rising from 11th place to 8th. (This is a great moment for us to reiterate our response to a complaint some readers have raised in years past: “HTML isn’t a programming language, it’s just markup.” At Spectrum, we take a pragmatic view of what is, and isn’t, a programming language. HTML is used by coders to instruct computers to do things, so we include it. We don’t insist on, for example, Turing completeness as a threshold for inclusion—and to get really nitpicky, as user Jonny Lin pointed out last year, HTML has grown so complex that when combined with CSS, it is now Turing complete, albeit with a little prodding and an appreciation of cellular automata.)

Finally, one last technical detail: We’ve made some tweaks under the hood to improve the robustness of the results, especially for less popular languages where the signals in the metrics are weaker and so more prone to statistical noise. So that users who look at historical data can make consistent comparisons, we’ve recalculated the previous year’s rankings with the new system. This could lead to some discrepancies between a language’s ranking in a given year as currently shown, versus the ranking that was shown in the original year of publication, but such differences should be relatively small and not affect the more popular languages in any case.

Categories: Computing, IEEE, Teaching Technology Tags:

Watch “The Computer Hack That Saved Apollo 14” on YouTube

September 22nd, 2017 No comments

Categories: Computing, Space Tags:

The Guy Who Invented Those Annoying Password Rules Now Regrets Wasting Your Time

August 8th, 2017 No comments

From: Gizmodo

We’ve all been forced to do it: create a password with at least so many characters, so many numbers, so many special characters, and maybe an uppercase letter. Guess what? The guy who invented these standards nearly 15 years ago now admits that they’re basically useless. He is also very sorry.

The man in question is Bill Burr, a former manager at the National Institute of Standards and Technology (NIST). In 2003, Burr drafted an eight-page guide on how to create secure passwords, creatively titled “NIST Special Publication 800-63. Appendix A.” This became the document that would go on to more or less dictate password requirements on everything from email accounts to login pages to your online banking portal. All those rules about using uppercase letters and special characters and numbers—those are all because of Bill.

The only problem is that Bill Burr didn’t really know much about how passwords worked back in 2003, when he wrote the manual. He certainly wasn’t a security expert. And now the retired 72-year-old bureaucrat wants to apologize.

“Much of what I did I now regret,” Bill Burr told The Wall Street Journal recently, admitting that his research into passwords mostly came from a white paper written in the 1980s, well before the web was even invented. “In the end, [the list of guidelines] was probably too complicated for a lot of folks to understand very well, and the truth is, it was barking up the wrong tree.”

Bill is not wrong. Simple math shows that a shorter password with wacky characters is much easier to crack than a long string of easy-to-remember words. The classic XKCD comic on password strength shows how four simple words create a passphrase that would take a computer 550 years to guess, while a nonsensical string of random characters would take approximately three days.
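The arithmetic behind those numbers is easy to check. The sketch below uses the comic's own assumptions: 1,000 guesses per second, four words drawn from a 2,048-word list (44 bits of entropy), and roughly 28 bits for the short "wacky" password. The attack rate and entropy figures are XKCD's estimates, not mine.

```python
import math

GUESSES_PER_SECOND = 1000  # assumed online attack rate, as in the comic

def years_to_crack(entropy_bits, rate=GUESSES_PER_SECOND):
    """Worst-case time to exhaust a search space of 2**entropy_bits guesses."""
    seconds = 2 ** entropy_bits / rate
    return seconds / (365 * 24 * 3600)

# Four words chosen from a 2,048-word list: 4 * log2(2048) = 44 bits.
passphrase_bits = 4 * math.log2(2048)
# A short password with character substitutions: roughly 28 bits.
wacky_bits = 28

print(f"passphrase: ~{years_to_crack(passphrase_bits):.0f} years")
print(f"wacky password: ~{years_to_crack(wacky_bits) * 365:.1f} days")
```

Running this reproduces the comic's figures: about 550 years for the passphrase versus about three days for the character-salad password.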

This is why the latest set of NIST guidelines recommends that people create long passphrases rather than gobbledygook words like the ones Bill thought were secure. (Pro tip: Use this guide to create a super secure passcode using a pair of dice.)
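The dice trick mentioned above is the Diceware method: rolling dice to pick words at random from a fixed list. A minimal software version looks like the sketch below; the eight-word list is a toy stand-in (a real Diceware list has 7,776 words, one per five-dice roll), and `secrets` stands in for the dice because it draws from the operating system's cryptographic random source.

```python
import secrets

# Toy word list for illustration only; use a full Diceware list in practice.
WORDS = ["correct", "horse", "battery", "staple", "orbit", "lantern", "pine", "gravel"]

def make_passphrase(n_words=4, wordlist=WORDS):
    # secrets.choice uses a CSPRNG, unlike random.choice, so picks are unpredictable.
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

print(make_passphrase())
```

Each word drawn from a full 7,776-word list adds about 12.9 bits of entropy, so even five or six words comfortably beat the old letters-numbers-symbols recipes.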

Inevitably, you have to wonder whether Bill feels not only regretful but also a little embarrassed. It’s not entirely his fault, though. Fifteen years ago there was very little research into passwords and information security, whereas researchers can now draw on millions upon millions of examples. And Bill wasn’t the only one to come up with regrettable ideas in the early days of the web. Remember pop-up ads, the scourge of the mid-aughts internet? Their inventor is super sorry as well. Oh, and the confusing, unnecessary double slash in web addresses? The inventor of that idea (and of the web itself), Tim Berners-Lee, is also sorry.

Technology is often an exercise in trial and error. If you get something right, as Jeff Bezos and Mark Zuckerberg have, the rewards are sweet. If you screw up and waste years of unsuspecting internet users’ time in the process, like Bill did, you get to apologize years later. We forgive you, Bill. At least some of us do.

[Wall Street Journal]

Categories: Computing Tags:

Turing Tumble: Gaming on a Mechanical Computer by Paul Boswell — Kickstarter

June 3rd, 2017 No comments
Categories: Computing, IEEE, Teaching Technology Tags:

Article: A Sensor That Could Soon Make Homes Scary-Smart

May 11th, 2017 No comments

A Sensor That Could Soon Make Homes Scary-Smart

Categories: Computing Tags:

Article: Scientists Achieve Direct Counterfactual Quantum Communication For The First Time

May 10th, 2017 No comments

Scientists Achieve Direct Counterfactual Quantum Communication For The First Time

Categories: Computing Tags:

Article: Learn any of these 16 programming languages and you’ll always have a job

May 7th, 2017 No comments

Learn any of these 16 programming languages and you’ll always have a job

Categories: Computing Tags:

Rubber Duck Debugging – Debugging software with a rubber ducky

April 24th, 2017 No comments
Categories: Computing, Uncategorized Tags:

State Progress on K-12 Computer Science Ed Policies: ‘We Have a Long Way to Go’

April 24th, 2017 No comments
Categories: Computing, Teaching Technology Tags:

Woman Pioneer Broke New Ground

April 11th, 2017 No comments

From: IEEE – The Institute

Grace Murray Hopper, 1906–1992

Known for: Inventing the first computer compiler and leading the development of the programming language COBOL (common business-oriented language).

Why it matters: Hopper is considered one of the founders of the information age. Her compiler, a collection of coded instructions that could be reused, saved programmers from having to write each program anew. It significantly advanced the art of programming. By the late 1970s, COBOL was the most extensively used computer language in the world.

Where she started: Hopper was a mathematics professor at Vassar College, in Poughkeepsie, N.Y., when she joined the U.S. Navy WAVES (Women Accepted for Volunteer Emergency Service) program in December 1943. She was commissioned a lieutenant the following year. She was named an IEEE Fellow in 1962 “for contributions in the field of automatic programming.”

Breakthrough: As a Navy lieutenant, she was assigned in 1944 to program the Mark I Automatic Sequence Controlled Calculator at Harvard under Howard Aiken, a computing pioneer. The Mark I, one of the first programmable computers, is an IEEE Milestone.

Categories: Computing, IEEE Tags: