There are some places that are just snobby about degrees - I know I wouldn't have been hired at the first place I worked without a degree of some kind. Other places wouldn't look at me because I didn't have a Masters.
But things have changed. There's much more emphasis now on what you can do, rather than what you've learned.
Since then, I've worked with all kinds of people and I've found that a degree has no bearing on the effectiveness of a programmer.
Self-taught people often don't have the same background in algorithms and data structures as grads, but the good ones quickly figure out when a particular algorithm or data structure is required, and then how to use it.
In my case I was behind on algorithms and advanced data structures because it's so rare that you actually need to build or use them beyond the library interfaces that come with the language you're using. But when I finally dug in to learn them, I realized that people who don't know algorithms tend to avoid the problems that are solved by knowing algorithms.
You can have a complete and full career without ever touching or understanding algorithms, but I do think the formal exposure to this topic is useful and not at all obvious for self-taught software devs.
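To make that concrete, here's a sketch (my own illustration, not from the original commenter) of the kind of problem that only looks approachable once you know the data structure exists; Python's `heapq` stands in for whatever library interface your language ships:

```python
import heapq
import random

# Find the 5 largest of a million values. Knowing heaps exist,
# you reach for the library interface instead of sorting everything.
values = [random.random() for _ in range(1_000_000)]

top_five = heapq.nlargest(5, values)          # O(n log k): heap of size 5
top_five_sorted = sorted(values)[-5:][::-1]   # O(n log n): same answer, more work

assert top_five == top_five_sorted
```

You never had to implement a heap; you just had to know the problem shape it solves.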
In my experience, it's been the assumptions about knowledge gained from having a degree in computer science. Many developers who hold this degree have assured me they didn't learn anything practical / have forgotten everything they learned. I've then heard the same devs crack jokes with each other about linked lists, and they also seem to instinctively know how libraries written by others were most likely built.
I think that some folks with a degree in computer science take for granted the small snippets of wisdom that help them do their job well every day. It's easy to forget where you learned something, or why a certain concept is straightforward to you at first glance.
For many years of my career, not knowing much about data structures and common algorithmic patterns definitely affected how fast I learned new concepts, and had a negative impact on the quality of the architectural patterns I designed. When I mirrored other patterns I saw, I often didn't understand why they were good patterns. Of course, not everyone would experience this, but personally it was true for me. I know many self-taught folks who never had the learning problems I did (perhaps they're a lot brighter than me).
I am self taught and over the past couple of years have filled in some of the CS learnings I missed out on. It was like a light bulb moment for me, where everything had so much more clarity than it had in the past. Additionally EVERYTHING WAS SO INTERESTING TO LEARN OMG. I LOVE hashtable theory, convex hull solutions, runtime vs memory usage, bloom filters etc. etc.
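For anyone curious, a Bloom filter is small enough to sketch in a few lines. This is a toy Python version of my own (the hashing scheme is an illustrative choice, not a production design):

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: a probabilistic set that answers either
    'definitely not present' or 'probably present'."""
    def __init__(self, size=1024, hashes=3):
        self.size = size
        self.hashes = hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k bit positions by salting a hash with an index.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # False positives are possible; false negatives are not.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("hello")
print(bf.might_contain("hello"))  # True (never a false negative)
```

The runtime-vs-memory trade-off is the whole point: a few kilobits stand in for a set that could be arbitrarily large, at the cost of occasional false positives.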
As a self taught programmer, I still struggle with feeling like I belong in this industry, but it's never too late to go back if you know you have gaps. It's made me love computers even more.
Teaching yourself will make you a better problem solver: researching why things don't work rather than being told exactly what to do. (Kids these days are lucky; sure, I was 12 in 1994 when I started to learn, but I didn't use the internet until 2000!)
Teaching yourself may, however, steer you into bad practices and doing things the wrong way. Whilst there are often many ways to achieve success with code, some of them are technically incorrect. I've seen this a few times in my career.
The absolute best way to learn is from people that are better than you - that is, become a junior developer in an agency that has multiple middleweight and senior developers. I did teach myself BASIC, I did go onto QBASIC during a subject that included programming at school (the teacher only knew RM BASIC, didn't have a clue what I was doing), I did formally learn (the basics of) C++ at university. I do have a degree. And it really bugs me when I see job adverts asking for developers to have a degree and experience - a few years experience alone is worth much more than a degree.
I think most of the supposed problems are overstated:
Being able to write up some bubble sort code off the top of your head might be impressive in a job interview, but it's about as useful as being able to recite the alphabet backwards in a second language. In short: you're never going to actually do it in real life.
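For reference, the party trick in question, as a minimal Python sketch:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs until a
    full pass makes no swaps."""
    items = list(items)  # don't mutate the caller's list
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:  # already sorted, stop early
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

In practice you'd call your language's built-in sort, which is exactly the commenter's point.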
The main difficulty I run into: people thinking that because I don't have a degree, I can't do the job.
In point of fact, I can probably do it better than your average graduate because I know how to Google. Seriously. I have always held that one of the most important skills you can possess is knowing how to learn your way around problems. Crafting decisive search queries that get the most direct result is the key to success.
Let me illustrate: ever sat behind someone as they Google something and been annoyed by the quality of their search query? That's likely because you're a better Googler than they are lmfao. You know that what they're typing isn't going to get them what they want, or at least not as quickly/directly.
For example:
"When did Abraham Lincoln die?"
is terribly inefficient; "lincoln death" will produce nearly the same results and is a lot quicker to type. That's an extremely contrived example, but I hope you get the point I'm trying to make. Because I learned from Google, StackOverflow, and GitHub issues instead of a textbook, I'm going to beat you to the fix 9 times out of 10.
This irks me to no end. 98% of the code we write uses other people's code, so a lot of the time it's more important to be good at finding and then using other people's code than to be able to reinvent the wheel yourself. People don't understand how important that is, and that kills me.
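A quick illustration of that point (my own example, not the commenter's): Python's standard library already ships a tested binary search, so finding and reusing it beats re-deriving it:

```python
import bisect

# Instead of hand-rolling binary search, use the stdlib's
# tested implementation on an already-sorted list.
scores = [10, 20, 30, 40, 50]

# Where would 35 be inserted to keep the list sorted?
print(bisect.bisect_left(scores, 35))  # 3

# Insert while preserving order, no manual index juggling.
bisect.insort(scores, 35)
print(scores)  # [10, 20, 30, 35, 40, 50]
```

Knowing that `bisect` exists, and how to search for it, is the skill being described.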
Always wondering if you're doing it the "right" way. That, and *possibly* missing out on the jobs that require a degree. I have been bitten by both. There's also the condescending tone other developers take when they find out you don't have a degree.
Other than that, I feel (very biased, of course, since I am self-taught) that self-taught developers have significant advantages. One of them is passion. That's why, nowadays, I prefer to hire passionate developers, without differentiating between self-taught and degreed. A degreed developer may not have the same passion as a self-taught one, working only for the paycheck. That's not to say all degreed developers lack passion; that's not at all the case. After all, they went for the degree for a reason, right? But I tend to find more passion among self-taught developers.
It's been a ride for sure -
Started dabbling in web development in '96 (16 years old)
Did some VB development at a bank in 2000. Started my first website (a car club) in 2000.
Got my first "real" programming job in 2002 in FoxPro.
Never held a job since that didn't require some sort of programming (was a systems administrator for 8 years but did web development there also and freelance on the side)
A job I had 3 years ago - everyone there had a college degree (I was the only one without). Every single one of them came to me for help / direction / how do I do "x" because I had the most experience.
I don't recall the last time the lack of a college degree affected my getting a job. I'm not looking, and never will look, for a job at Google or the like (not my type of scene), but I can't recall ever not being hired because of a lack of degree.
As for difficulties -
First that comes to mind is where to start. As often asked here ("how do I start learning X framework or Y language"), sometimes the sheer amount of tutorials, videos, etc. online makes it hard to figure out what's good and what's garbage, what to start with (looking at you, React), and where to go once you've got the basics down. That said, being self-taught has taught me how to search and what to search for, so these issues are usually just road bumps.
Getting your start is another PITA. Finding that first job where I was actually called a "programmer" was a bit difficult. Freelancing helped me a lot to accomplish that; it filled out my resume nicely. I still freelance because I like building the random site (I have a meeting today, in fact), and IMO it's always a good idea to show extracurricular projects.
Respect from family was always a burden. Back in the day it was all "What are you going to do without a degree? You'll never get a good job." Nowadays it's "What are you working on now? What are you learning?" and "We're so proud of you."
Remembering your roots is another thing that keeps you in check. I've worked in PHP for sooooooo long. I can write it better than I can write English (and I was born in the USA, haha). After you do something in a given language for so long, it's hard to accept new things. Why use React when I can just do it in AJS/PHP?!? I don't want to learn X. I'm too old for this S**T. NO. Just, stop! I'm currently unemployed (technically, I'm still making money from my side projects), but this is exactly the time that needs to be spent getting back into learning mode and picking up React or Swift or whatever. I'm getting tired of writing in the same languages I've written in for the last decade and a half. I really want an app development job in Swift. So you need to focus, realize this industry changes monthly (weekly?), and keep at it. There are few jobs where you ever stop learning something new (none come to mind), but programming is notorious for it: you will never stop learning.
Bulletpoints:
I am all for self-teaching; it's how I learned. That said, I've been bitten by it a few times as well, so, this is speaking from experience.
With self-teaching, what constitutes "knowing" a thing: getting it to work, or understanding what it does? Using code you don't truly understand can lead to problems down the line (things like public vs. private variables, scope, race conditions, excessive resource consumption, loose security, etc.).
Additionally, extra effort needs to be made to pay attention to how the industry expects you to work. Things like source control, frameworks, package managers, etc. None of that stuff is NEEDED to make an application, and there are plenty of such things out there to suck up your time and brainspace. However, if you plan to work with other developers in the future, they will want to use these tools because that's what the industry is doing, and you may have to unlearn what you've learned in order to adopt them.
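A classic example of code that "works" without being understood (my own illustration): Python's mutable default argument. Scope-related surprises like this are exactly the kind of thing that bites later:

```python
def append_item_buggy(item, bucket=[]):
    # Looks fine, but the default list is created ONCE at function
    # definition and shared across every call that omits `bucket`.
    bucket.append(item)
    return bucket

print(append_item_buggy("a"))  # ['a']
print(append_item_buggy("b"))  # ['a', 'b']  <- surprise!

def append_item_fixed(item, bucket=None):
    # Idiomatic fix: create a fresh list on each call.
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_item_fixed("a"))  # ['a']
print(append_item_fixed("b"))  # ['b']
```

Copy-pasting the first version from somewhere and shipping it is "getting it to work"; knowing why the second version exists is "understanding what it does".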
In my short work experience I found self-taught developers to perform better than formally educated ones.
The reason I give for this is that a very relevant part of our job is spent finding, reading, and understanding documentation (and, when the documentation isn't there, finding a solution anyway or dying trying). As a self-taught developer, you will have spent years looking for and interpreting learning resources, while educated developers may have been spoon-fed.
Even worse, a student may (read: "will") fall into the trap of only learning the material strictly required to pass the course, and will rarely expand their knowledge out of curiosity, which is what moves a self-taught developer in the first place.
Getting back to difficulties:
Being a self-taught developer is still great.
Missing the basics is usually the main issue that comes with lacking a theoretical education.
It's not really a big issue as long as you learn them alongside your practical work. Don't just practice work implementations; the fundamentals are very important and will help you get better too, but as always, balance is key.
Try algorithms yourself and watch some YouTube videos from different universities or other sources. Especially try to really understand why certain approaches are better in certain use cases.
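One quick way to see "why certain approaches are better in certain use cases" is to measure them. A hedged Python example of my own (exact numbers are machine-dependent):

```python
import timeit

# Membership testing: a list scans every element (O(n)),
# a set hashes straight to the answer (O(1) on average).
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

# Look for the worst-case element (last in the list), 100 times each.
list_time = timeit.timeit(lambda: n - 1 in as_list, number=100)
set_time = timeit.timeit(lambda: n - 1 in as_set, number=100)

print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```

The gap grows with `n`, which is the intuition behind big-O notation that formal courses drill in.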
Other difficulties include elitism and impostor syndrome; "you need to have 10 years of experience, but no one is willing to give you a chance" is one of my favorites.
Steven Kollerer
I am still rather young (just turned 21) and only have a handful of years of programming (C++ only) under my belt, but so far the greatest difficulty I face is the insecurity of not knowing whether what you do is actually good or bad. Since you usually figure everything out yourself, with documentation that only tells you what X does, you do what you think is good while asking yourself whether it might actually be really bad in the eyes of someone who was taught by a professional.
For example, I don't follow any of the style guidelines that are usually present in the code of someone who has had a teacher, but that comes from me simply thinking they are unnecessary if the code is very easy to read (by choosing smart variable names and excluding prefixes; I don't even know if "prefixes" is the right word).
There are also self-taught programmers who don't struggle with this kind of insecurity at all and just think they're fine as long as it works, but I believe that's pure ignorance rather than optimism.