Something I was taught VERY early on in programming was the mantra "the less code you write, the less there is to break". It's a bit trite, but it's also quite true.
As such, the scale of the project -- and how much actual effort you're willing to put into it -- is the deciding factor.
The more code you blindly trust from other sources -- like frameworks -- the MORE likely you are to have things go wrong. The more you rely on things that let you avoid learning the underlying languages and technologies, the less capable and competent you will be at fixing things when they do go wrong.
... which is why all the sleazy "shortcuts" making wild, unfounded claims about being somehow magically "easier" or "simpler" often leave beginners and ALLEGED experts alike completely hobbled when things break. That fundamental lack of understanding that drove them to the off-the-shelf answers in the first place is precisely what causes bugs, EVEN when the code they are recycling or trusting had no bugs out of the box.
Of course it also depends on how you define a bug, or as Mozilla calls them "features".
The choice of programming language can also have a huge impact -- see the security flaws and logic bugs that languages like Ada were carefully crafted to prevent, where whole classes of mistakes simply cannot make it past the compiler, and most of what slips through gets caught by mandatory runtime checks instead of silently corrupting your data.
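To make that concrete, here's a rough sketch of the idea in Rust -- a hypothetical example, not from any real codebase. Ada expresses it even more directly with something like `type Percent is range 0 .. 100;`, after which no `Percent` can ever hold 150. Rust's newtype pattern with a checked constructor gets you much the same guarantee:

```rust
// Hypothetical sketch: approximating an Ada range-constrained type in Rust.
// The field is kept private to the type, so the checked constructor is the
// only way to produce a value -- an out-of-range Percent cannot exist.
#[derive(Debug, PartialEq)]
struct Percent(u8);

impl Percent {
    // Returns None for anything outside 0..=100 instead of letting a bad
    // value sneak into the rest of the program.
    fn new(v: u8) -> Option<Percent> {
        if v <= 100 { Some(Percent(v)) } else { None }
    }
}

fn main() {
    assert!(Percent::new(50).is_some());  // in range: accepted
    assert!(Percent::new(150).is_none()); // out of range: rejected at the door
    println!("ok");
}
```

The point isn't the five lines of boilerplate; it's that the mistake gets rejected at one well-defined boundary instead of surfacing as a mystery bug three modules away.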
Which is why the US DoD ditching its Ada mandate to allow in C++ and other languages was 100% grade A herpa-freaking-derp.