Thing is, every browser, the big ones like Chrome, Firefox, and Internet Explorer, as well as small ones like Lynx and Midori, has to be able to display your site. So people went ahead and thought about it: what kind of information is important enough that every browser should be able to handle it? And those smart people decided that HTML (not Jade or JSON or ...), CSS (not Sass, Less, etc.) and JS (not Ruby, C#, etc.) should be the minimum standard which has to be supported by all browsers so a website can be displayed correctly. While that is one decision on standards, no one stops you from using a different language for the web. Go ahead and use C#! But you will see that unless you get C# standardized, normal people with mainstream browsers will not be able to execute your scripts.
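To make that last point concrete, here is a minimal sketch (the `type` values are illustrative): a browser only executes script blocks whose language it recognizes, and the only language every mainstream browser understands is JavaScript. A `<script>` element with an unknown `type` is simply treated as inert data.

```html
<!-- Every mainstream browser knows how to execute this. -->
<script type="text/javascript">
  console.log("Hello from JavaScript");
</script>

<!-- No browser ships a C# interpreter, so this block is ignored entirely. -->
<script type="text/csharp">
  Console.WriteLine("Hello from C#"); // never runs
</script>
```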
Standardizing a new language is a very painful procedure. It includes explaining why the language in question is better suited to the web than JavaScript. Is it because you prefer that language? Then screw you! No one cares about your opinion! That might sound harsh, but the only thing that is relevant to the world is facts. If you present facts to the standards bodies (the W3C, or Ecma in JavaScript's case) which prove that your language is good enough for the web, they might accept it (a thing at which Google, Microsoft, and many others have failed; think of Dart or VBScript in the browser).
So, the thing is: JavaScript was chosen as a standard, for reasons. You either accept the standard and build a site which everyone can view, or you do lots of research and explain to the world why an interpreter for a different language should be included in every major browser. That's quite hard, and you will have to prove definite shortcomings of JavaScript, or rather ECMAScript (shortcomings which cannot be overcome by a standard update, like ES8), which make the existence of a different programming language in a browser plausible under normal conditions (no special cases allowed).