You are missing my point. Python may be slower than some other languages in terms of performance, but when running on modern hardware, this difference is minuscule, and most noticeable slow-downs and performance issues stem from badly written code and algorithms rather than the language.
Bro, check out my post above with the actual benchmarks that you so conveniently seem to totally ignore. You say Python "may be slower" and the "difference is minuscule" -- it is *dog slow*, one of the slowest general-purpose languages around (maybe even slower than Ruby), and the difference is *not* minuscule!
These benchmarks mean nothing without context. As a simple analogy, a Lamborghini might be 3 times faster than the average car, but police departments everywhere except Dubai are totally fine with Dodge Chargers or whatever as their highway cruisers, because a Charger is good enough to run down 99% of civilian cars. Same thing here: it doesn't matter how much faster C is than Python, because Python is still good enough performance-wise (if coded intelligently) for the vast majority of real-world business applications. If you happen to run across one of the few applications where it isn't good enough, you can always swap out Python in the critical code segments for another language (one that integrates easily with Python), or just use a different language entirely.
and most noticeable slow-downs and performance issues stem from badly written code and algorithms rather than the language.
Please, check out that benchmark above. The CPython version runs at 0.004% of the performance of the C++/Nim version. I have the code in a GitHub repo; play around with the Python version if you want, and if you can optimise my "badly written code and algorithm" to, say, reach just 10% of the C++ version's performance while retaining the same functionality (I'm not even setting the bar high here), I'll buy you a Ferrari!
You seem to be repeating the "optimise the algorithm" mantra, which is not untrue, but it can lead to false conclusions, as in your case. The algorithm I'm using is *already* optimal; the choice of *language* accounts for the ~250-fold difference in performance (it's the exact same algorithm!). This is where the whole "optimise the algorithm" advice becomes a fallacy, even though it's ironically a true statement at its core. You can achieve 2-50x speedups even when optimising C++ code *purely at the code level* (!) if you know what you're doing, while the algorithm stays completely unchanged. And certain types of optimisations are simply not possible in Python or similar dynamic languages, so 100-200 fold speedups are expected and quite unsurprising when porting calculation-intensive stuff from Python to something else.
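To make the "same algorithm, purely code-level change" idea concrete, here's a minimal sketch (just an illustration, not the benchmark from my repo): hoisting a bound-method lookup out of a loop in Python. The algorithm is identical in both versions; only the amount of interpreter work per iteration differs.

```python
import timeit

def double_all(items):
    out = []
    for x in items:
        out.append(x * 2)   # attribute lookup "out.append" on every iteration
    return out

def double_all_hoisted(items):
    out = []
    append = out.append     # look the bound method up once, outside the loop
    for x in items:
        append(x * 2)       # exact same algorithm, less interpreter overhead
    return out

data = list(range(100_000))
assert double_all(data) == double_all_hoisted(data)

# Time both; the hoisted version is typically measurably faster, though the
# exact ratio depends on the interpreter version and hardware.
t1 = timeit.timeit(lambda: double_all(data), number=20)
t2 = timeit.timeit(lambda: double_all_hoisted(data), number=20)
print(f"plain: {t1:.3f}s  hoisted: {t2:.3f}s")
```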
This has nothing to do with the argument at hand. It's like gamers running benchmarks to push their hardware to its limit, then going back to play Minecraft, Battle Brothers, Unity games, and ASCII roguelikes on it. Most business applications do not require the optimizations you are talking about, so they're fairly irrelevant to most programming jobs. Obviously, if you work in a field that needs those kinds of optimizations, you should choose a language suited to them.
On the flip side, Python lets programmers be highly productive and ship programs at a high clip, and programmers are the most expensive resource in development. So fewer programmer man-hours (higher productivity) will always win out financially over slightly higher hardware costs, which is exactly why you see so many high-end startups and high-tech companies use Python.
That's true and I agree with that general idea, but it still doesn't make Python a good choice. Maybe in web development, where you're not doing anything performance-intensive, Python is okay because the network and the database are the bottleneck. But if you start doing anything more performance-intensive, say a chat server with a few million online users, or a high-performance web server with heavy concurrency, things start falling apart.
Have Instagram and Pinterest and Netflix fallen apart?
Like I said, Python is (or rather was, as there are better options now) an okay glue language;
I don't even understand what you mean by glue language. All main programming languages are glue languages in modern software: in a typical system you might have business logic in the main programming language, DB related stuff in SQL or an equivalent, front-end UI stuff in JavaScript, some other specialized language for a niche area like AI, etc. So the main language will always be the glue that connects these different parts of the system.
It's actually not a contradiction at all, and note the emphasis I made in your statement. What happens quite often is that some guys have an idea for a website or web service, then hack something together quickly in Python, Ruby, NodeJS or whatever. 99% of these companies die within a year or two, but the 1% that survives and gets really popular will start experiencing scaling and performance issues as their traffic grows. So it's time to re-architect things from the ground up; this is when the original "founder code" written in Ruby, Python, etc. gets tossed in the trash, and everything gets rewritten in Java/Scala/Kotlin or whatever else is much more robust.
Source: I've worked at one such place that made the Ruby -> Scala transition (because of performance and maintainability issues). To be fair, it's almost never just about rewriting the same thing in a different language; it's also about re-architecting the whole system from the ground up. But it's certainly true that sloppy languages lead to sloppy practices.
I think you are looking at it from your own biased view. You believe that the switch was made due to performance issues with Python or because it doesn't scale. I, on the other hand, believe what happens in these cases (which are not guaranteed btw, some companies continue running on Python just fine at huge scale, see my examples before) is that as the company grows significantly, they have to increase their programming corps by a large degree. So they go from a few brilliant programmers to large teams of average/young/outsourced programmers. And with the latter, as I stated before, you definitely want a stricter, more idiot-proof language like Java/C#.
You are missing my point. Python may be slower than some other languages in terms of performance, but when running on modern hardware, this difference is minuscule
This isn't true for many types of programs, most notably games. I don't know where people get this "modern computers are fast" meme, but it just isn't true. It's easy to write a slow algorithm that totally destroys the CPU, or at least one of its cores.
1. You are conflating algorithm-slow with language-slow. Two completely different things. Language-related performance is generally not a big deal today because it just adds some overhead to running code. Compiled languages like C++ are compiled ahead of time into native machine code, which the CPU executes directly, so the code runs fast. Interpreted languages like Python are compiled to bytecode that runs on a virtual machine (the interpreter), which dispatches each instruction at run-time. Obviously this adds overhead to every operation, but it's a roughly constant factor, so when modern hardware can run some obscene number of operations per second, paying that constant factor is not a huge deal for most business software.
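For what it's worth, you can see that per-instruction dispatch directly: CPython compiles a function to bytecode, and the interpreter's VM executes those instructions one by one at run-time. A quick look with the stdlib `dis` module (just an illustration of where the interpreter overhead lives):

```python
import dis

def add(a, b):
    return a + b

# Each line of output is one bytecode instruction that the CPython VM
# dispatches at run-time -- this dispatch loop is the per-operation
# overhead that ahead-of-time compiled native code doesn't pay.
dis.dis(add)
```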
Bad algorithms or code, on the other hand, can blow up the running time's complexity class, which no hardware will help you with. Things like nested loops, scanning years' worth of text logs letter by letter without an index, the n+1 problem in DB queries, etc. For large enough sets of data, these kinds of errors can literally push execution times to cosmic scales.
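A tiny sketch of the difference (my own illustrative example, not from the thread's benchmark): the same "find a duplicate" task with a nested-loop scan versus a set lookup. No language change at all, just a better algorithm, and the cost drops from O(n^2) to O(n):

```python
def has_duplicate_nested(items):
    # nested loops: O(n^2) comparisons -- fine for tiny inputs,
    # catastrophic for large ones, in any language
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_set(items):
    # one pass with a hash set: O(n) on average
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

sample = [3, 1, 4, 1, 5]
assert has_duplicate_nested(sample) == has_duplicate_set(sample) == True
```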
2. I keep repeating this over and over: most of you are fixated on game programming, but game programming is a small sub-field of the entire field of programming, and the rest of it has much lower performance needs.
You keep bringing up this TIOBE index, which I'd never heard of before, so OK, I took the bait and looked at it. Let's see: C is #2, Visual Basic at #6 is ahead of JavaScript at #7, and Assembly Language is ahead of SQL and just behind JavaScript. Who is trolling now?
Like, do you seriously believe there are more jobs for C, Visual Basic, and Assembly programmers today than for JavaScript?
I don't have to believe anything, because there is such a thing as the TIOBE Index, where I can check their scoring methodology and decide whether the source is legit or not. And IMHO it's legit. C and ASM being more popular than JS is the most obvious one, thanks to IoT alone, plus embedded systems, hardware drivers and such. I have no clue who is using VB; I'm not going to theorize about it.
Maybe there are no jobs in your area for these languages, but your area is not the whole world. Don't be ignorant.
Sorry to break it to you bro, but that index is BS. If you really believe there is more demand for VB and assembly language than for JS, I dunno what to tell ya.
According to that page, Pinterest is the only one that uses Python almost exclusively. The other websites use a combination of several languages, and therefore Python is likely used as glue code, the only thing it is capable of.
Quora is not mentioned on that page at all. I never said it does not use Python, because it is widely known that it does.
Uber is known to be an early adopter of Node.js and later Go. What is your basis for claiming that Uber uses Python?
All modern systems use combinations of languages, so I don't get your point. If by glue code you mean the main programming language that connects all the other languages, then OK.
Uber:
https://marutitech.com/build-an-app-like-uber/
"What programming language does Uber use?
Uber’s engineers primarily write in Python, Node.js, Go, and Java. They started with two main languages: Node.js for the Marketplace team, and Python for everyone else."