@nathaniel
In general, proprietary software seems to have a shorter lifespan. Here's a snapshot of software I've used since around 2000 (plus or minus a few years) and what has happened since:

- Microsoft Visual Basic (no longer using) - moved to Python
- Microsoft Visual C++ (no longer using) - moved to vim/Atom, compiling with gcc or Intel compilers
- Adobe Photoshop CS2 (no longer using; circumstances changed)
- Matlab 5 (no longer using) - moved to Python/Matplotlib for plotting; numerical libraries or hand-coded routines for number crunching
- Netscape (no longer using) / Firefox (rarely use) - moved to Safari / Chrome / Chromium
- Microsoft Office (still using, but only occasionally; I also use Google Docs, OpenOffice, and Apple Pages/Numbers/Keynote. Apple Mail and web- or mobile-based mail have largely replaced Microsoft Outlook and Outlook Express)
- Adobe Acrobat Professional (still using, but only occasionally; I frequently use Apple's Preview, PDF Expert, and Evince as alternatives)
- Bash shell & unix commands (still using)

Software that I picked up around 2007-2010 (a period in which open source software improved significantly in quality and/or popularity):

- Apple Aperture (rarely use anymore; circumstances changed) - deprecated
- GNU Compiler Collection and friends: gcc, gfortran, make (still using)
- OpenMPI / MPICH (still using)
- Python (still using)
- Subversion (no longer using) - replaced by Git
- Git (still using)
- VMWare Fusion (no longer using) - replaced by VirtualBox
- VirtualBox (still using)
- Apple Xcode (rarely use; circumstances changed)

Survival scorecard - 75% survival rate for open source vs 20% for proprietary software:

- Proprietary (20% survival): 2 still in use [MS Office, Acrobat] vs 8 abandoned [VB, VC++, Photoshop, Matlab, Netscape, Aperture, VMWare, Xcode]
- Open source (75% survival): 6 still in use [Bash, GCC, MPI, Python, Git, VirtualBox] vs 2 abandoned or rarely used [Subversion, Firefox]

Quality scorecard:

- Proprietary is better: count = 4 [e.g. Microsoft Office, Adobe Acrobat Pro, Photoshop, Aperture]
- Open source is better, or "good enough" that the proprietary equivalent (if any) isn't necessary or its cost is hard to justify: count = 7 [e.g. Python, Matplotlib, bash, git, gcc, OpenMPI/MPICH, VirtualBox]
I agree that AI may be hyped up a little too much. Intelligence isn't linear, but multi-dimensional (more about this here). As another example, ATMs failed to put bank tellers out of a job; in fact, the number of bank tellers increased.

Exponential growth in AI is going to require exponential growth in data (more data allows more complex algorithms--the general guideline is that you need 10 data points for every parameter / unit of VC dimension in your algorithm) and exponential growth in compute power to train your algorithms on that same data. But Moore's law is dead after 50 years (Intel's tick-tock strategy has become tick-tock-tock). The development of the silicon chip plus Moore's-law exponential growth (transistors doubling every 18 months) transformed many industries and disrupted many jobs, but the rate of disruption was manageable. It's hard to imagine AI disruption exceeding those previous levels of disruption without something better than Moore's-law growth (algorithm complexity would have to double in much less than 18 months). I'm not saying that there'll be no job disruption at all--I'm just saying that the rate of job disruption isn't necessarily going to be much larger than in the past. And is it even possible to say that the job disruption rate is going to apply equally across all countries?

Lastly, people should be aware that there are lots of reasons to hype up AI. It's always nice for startups to see their valuations go sky high... same for the stock price of listed companies. AI hype helps people get more support (or funding) for their AI ideas and projects. People also get to sell more copies of their books, call themselves an AI guru, etc.
Yes and no. Machine learning is a bit like an advanced generalisation of linear regression. It's successful in a number of "use cases" (which may be hyped up), but the areas where it isn't successful don't make the news. There are a lot of reasons why machine learning might not end up being the best choice, such as the following:

- The cost of getting the data vs the potential revenue gain from the data: in many cases the ratio between these two isn't known, so it's difficult to build a business case.
- Data could be scattered all over the organization, and may not always be in nice "Excel tables" (it could be locked up in images, emails, handwritten log books, etc.).
- Not having enough data points.
- Data quality, i.e. the signal-to-noise ratio in the data.
- Algorithms become "good" because a tradeoff is being made: the tradeoff makes the algorithm better at one particular problem, but this comes at the expense of poorer performance on other problems. This argument follows the "no free lunch theorem" (which has been mathematically proven) explained here.
- If there's already a physics-based model which describes the data, the machine learning model could fail to outperform it.
- Extrapolation: regression works best when you interpolate between data points within the training data, but starts to perform poorly when you need to extrapolate (make predictions outside the data it was trained on). Likewise, machine learning models tend to perform well within the training data, but if the new data has little resemblance to the original training set, expect the predictive performance to be poor.

So while machine learning is going to be awesome in some specific areas, be aware that the non-success stories don't get hyped up.
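The extrapolation point is easy to demonstrate with a toy sketch (my own illustration, not from the answer above): fit an ordinary least-squares line to data generated by y = x^2, then compare its error inside and far outside the training range.

```python
# Toy illustration: a linear model fit on x in [0, 2] interpolates
# reasonably, but extrapolates badly when the true relation is y = x^2.

def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Training data: y = x^2 sampled on [0, 2]
xs = [i / 10 for i in range(21)]
ys = [x ** 2 for x in xs]
a, b = fit_line(xs, ys)

def predict(x):
    return a + b * x

# Inside the training range the fitted line is a tolerable approximation...
interp_err = abs(predict(1.0) - 1.0 ** 2)
# ...but well outside it the error blows up.
extrap_err = abs(predict(10.0) - 10.0 ** 2)
print(interp_err, extrap_err)
```

The same qualitative behaviour shows up with more complex models: predictions degrade as the inputs drift away from anything resembling the training set.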
Both. I'm thinking along the lines of being a "T-shaped professional":

- Deep skills in about 1-3 languages (in my case, C & Python).
- A wide skill set across a handful of languages and tools (entry-level to intermediate knowledge). In many cases, languages such as bash / the unix command line, HTML, and SQL would go here. In terms of tools, git would go here (git isn't really a language).

Specialize enough in one area so that people know you are the "expert" or "go-to person" for anything related to that area--this is the area people remember you for. But being broad across skill sets helps in working with others, as well as giving you a broader perspective.
Writing "hello world" on punch cards. But for something that's typed into a computer via a keyboard (and in a language in common use): any OpenCL hello world code where you're trying to run on GPUs or FPGAs--two examples: OpenCL hello world for GPUs, or this one from Intel for FPGAs. Also the Windows API hello world. For esoteric programming languages, I like this L33t hello world.
Yes. However, the reality is that sometimes you don't actually know how many bugs there are in the code--a bug may not be just one line of code; it might be a lot more. Or the bug might stem from an incorrect assumption that was made, and if that assumption turns out to be wrong, multiple chunks of new code need to be written.
1. Interface--practically every major language has a way to interface with code written in C/C++, or it interfaces with C/C++ libraries in some way.
2. Speed--write optimised / number-crunching code (that can still be accessed from your other code written in another language).
3. Computer architecture--learn about getting close to the "bare metal": pointers, registers, cache, and the "volatile" keyword; learn about instruction set architectures (ISAs), advanced vector extensions (AVX & AVX2), #pragma simd, prefetching, and memory alignment. Know the purpose of certain instruction sets developed for certain architectures (AVX-512 for Intel Knights Landing, CUDA code optimised for Nvidia GPUs). Or at least learn enough so that you can appreciate some of the points raised in Ulrich Drepper's article "What Every Programmer Should Know About Memory". Not many languages out there give you that crazy a level of control, or let you build an appreciation for some of the things that are happening "under the hood".
4. Basics can be learned very quickly--the basics of C can be picked up over a weekend. Better still, what if the bottleneck in your code is just one or two loops, which could easily be re-written in an optimised form with just a few lines of C? See point #1 and point #2.
5. Embedded systems / IoT / Arduino, etc.--C is a very popular language for these kinds of systems; having a small footprint also helps.
6. Historical--many things in C/C++ influenced other languages. Learn a bit of the history of how various concepts in programming languages evolved, and gain a bit of appreciation for why C has survived for so long.
7. Operating system--learn about (or be able to access) Linux system calls (and perhaps those of other operating systems as well) and strace (which tracks the system calls made by an executable). Here is a cheatsheet of Linux system calls: http://www.digilife.be/quickreferences/qrc/linux%20system%20call%20quick%20reference.pdf. And going back to point #1, if your programming language of choice lacks access to a certain system call that you need, it might be possible to access it through C.
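As a small illustration of point #1 (interfacing), Python's standard-library ctypes module can call functions in a compiled C library directly, with no wrapper code. This is a minimal sketch; the "libc.so.6" fallback name assumes a typical Linux/glibc setup (adjust for macOS/Windows):

```python
import ctypes
import ctypes.util

# Locate and load the C standard library; "libc.so.6" is the usual
# name on Linux (assumption -- other platforms name it differently).
libc = ctypes.CDLL(ctypes.util.find_library("c") or "libc.so.6")

# Declare the C prototype: size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

# Call straight into compiled C code from Python.
length = libc.strlen(b"hello, world")
print(length)  # prints 12
```

The same mechanism works for your own shared library built from a few lines of optimised C (points #2 and #4): compile it with `gcc -shared -fPIC`, load it with `ctypes.CDLL`, and declare the prototypes the same way.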
I graduated from university in 2000. At that time, I had done a bit of coding in QuickBasic/Visual Basic, C/[Visual] C++, Fortran, and Matlab. There was a lot less to learn then (picking up the basics of C, Fortran, Matlab, or Basic could be done in a weekend, and reasonable proficiency could be achieved within a month or so). Documentation, however, was really sub-par, and there was no Stack Overflow (and finding an answer on the web via search engines like AltaVista / Infoseek / Yahoo wasn't easy). So while the basics were easy (e.g. coding a 100-line program in C/Fortran/Basic), much of the advanced stuff was hard--for example, trying to figure out some user-interface thing in Visual C++ via the Microsoft Foundation Class (MFC) documentation was a real pain for me. I had a lot of trouble figuring out how to use libjpeg to read a JPEG file (I never succeeded).

Now it's 2017. I still code in C/C++ for number-crunching stuff, but I've moved to Python for all the high-level stuff. It's now a lot easier to do certain things (use Python to read image files, make plots with Matplotlib, interface with SQL / NoSQL databases, or do statistics/machine learning). Building a basic GUI or web interface is generally easier. There's Stack Overflow, plenty of online MOOCs, and countless other places to go on the web for information. On the other hand, there's a lot more to master, and so many competing languages and frameworks/libraries to choose from. Even for a simple language like C, knowing how to optimise it can be difficult to really master (think cache optimisations, #pragma simd, hybrid MPI+OpenMP, or compilation/optimisation for Nvidia's GPUs or Intel's Xeon Phi). And for any language, if you want to do things properly, unit tests and version control are now standard practice (unlike in 2000).

So I think becoming an "expert" in some particular area is probably not very different from 2000. On the other hand, if you just want to get something done, or you want to build a "minimum viable product", many things are going to be considerably easier now. 1000 lines of code will get you a lot farther in 2017 than in 2000.