These might be theoretical issues that people without experience worry about, but let me share what I've seen in practice over almost a decade working with Clojure at Nu.
We mostly hired people with no previous Clojure experience. The majority of hires could pick it up and get productive quickly. People fresh out of college picked it up faster. I even had a case of an employee transitioning careers into software engineering, with no previous programming experience, and the language was a non-issue.
I can't remember an instance where the language was a barrier to shipping something. Thanks to the reduced syntax surface and the lack of exotic features, the very large codebase followed the same basic idioms, and it was often easy to dive into any part of it and contribute. Thanks to the focus on data structures and the REPL, understanding the codebase was simply a process of running parts of the program, inspecting its state, making a change, and repeating. Following this process naturally led to having a good test suite, and we would rely on that.
Running on the JVM is the opposite of a problem. Being able to leverage the extensive JVM ecosystem is an enormous advantage for any real business, and the runtime performance itself is top tier and always improving.
The only hurdle I observed in practice was the lack of compile-time guarantees. But since it was a large codebase anyway, static guarantees would only have mattered in a local context, and we had our own solution for checking types at service boundaries, so in the end it would have been a small gain regardless.
Clojure is easily the most boring, stable language ecosystem I’ve used. The core team is obsessed with the stability of the language, often to the detriment of other language values.
This attitude also exists among library authors to a significant degree. There is a lot of old Clojure code out there that just runs, with no tweaks needed regardless of language version.
Also, you have access to tons of battle tested Java libraries, and the JVM itself is super stable now.
I won’t comment on or argue with your other points, but Clojure has been stable and boring for more than a decade now, in my experience.
What I meant by that is the metaprogramming capabilities that often get cited for allowing devs to create their own domain specific "mini languages". To me that's a "creative" way to write code because the end result could be wildly different depending on who's doing the writing. And creativity invites over-engineering, over-abstraction, and hidden costs. That's what I meant by the "opposite of boring".
You linked me to this comment from another one and I have to agree with this sentiment.
Creating these mini DSLs is something that requires a lot of thought and good design. There is a danger here as you pointed out sharply.
But I have some caveats and counter examples:
I would say the danger is greater when using macros and far less when using data DSLs. The Clojure community has been moving towards the latter for a while now.
There are some _very good_ examples of data DSLs provided by libraries: hiccup (and derived libraries), reitit, malli, honeysql, core.match, spec, and the Datalog flavors of Clojure come to mind immediately (there are more that I forget).
In many cases they can even improve performance, because they can optimize what you put into them behind the scenes.
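To make the data-DSL point concrete, here's a toy, dependency-free sketch of the hiccup idea. To be clear, this is not hiccup itself, just an illustration of HTML-as-plain-data with a hypothetical `render` function:

```clojure
;; A hiccup-style tree: just vectors, keywords, maps, and strings.
(def page [:ul {:class "users"} [:li "Alice"] [:li "Bob"]])

;; A toy renderer -- an ordinary function interpreting plain data.
(defn render [node]
  (if (string? node)
    node
    (let [[tag & more] node
          [attrs children] (if (map? (first more))
                             [(first more) (rest more)]
                             [{} more])
          attr-str (apply str (for [[k v] attrs]
                                (str " " (name k) "=\"" v "\"")))]
      (str "<" (name tag) attr-str ">"
           (apply str (map render children))
           "</" (name tag) ">"))))

(render page)
;; => "<ul class=\"users\"><li>Alice</li><li>Bob</li></ul>"
```

Because the "program" is plain data, ordinary functions can inspect, transform, or optimize it before it's ever interpreted, which is where those performance wins come from.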
In practice, though, most developers don’t do that.
There’s a rule of thumb: write a macro as a last resort.
It’s not hard to stick to it. In general, you can go a long, long way with HOFs, transducers, and standard macros before a hand-rolled macro would serve you better.
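As a sketch of what that looks like (with hypothetical data), here's the same pipeline written with plain HOFs and then as a reusable transducer, with no hand-rolled macro in sight:

```clojure
(def orders [{:id 1 :total 40} {:id 2 :total 250} {:id 3 :total 120}])

;; Plain higher-order functions (->> is one of the standard macros):
(->> orders
     (filter #(> (:total %) 100))
     (mapv :id))
;; => [2 3]

;; The same logic as a transducer -- no intermediate sequences:
(def big-order-ids
  (comp (filter #(> (:total %) 100))
        (map :id)))

(into [] big-order-ids orders)
;; => [2 3]
```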
> syntax is hard to read unless you spend a lot of time getting used to it
That’s pretty much the exact opposite of how I’ve always felt. Perhaps because I’m not a programmer by education, I always struggle to remember the syntax of programming languages unless I’m working in them all the time. When I return to a language after working in others for a while, I always have difficulty remembering the syntax, and I spend some time feeling very frustrated.
Clojure and Lisps more generally are the exception. There is very little syntax, and therefore nothing to remember. I can pick it up and feel at home immediately, no matter how long I’ve been away from the language.
I don't think the syntax is hard to read in any kind of objective sense, it's just different than most mainstream languages. Greek would be hard for me to read too, but that's not because it's necessarily harder to read than English, just that I don't really know Greek.
I agree with the short variable name convention, that's annoying and I wish people would stop that.
Everyone complains about a lack of type safety, but honestly I really just don't find that that is as much of an issue as people say it is. I dunno, I guess I feel like for the things I write in Clojure, type issues manifest pretty early and don't really affect production systems.
The clearest use-case I have for Clojure is how much easier it is to get concurrent software correct while still being able to use your Java libraries. The data structures being persistent gives you a lot of thread safety for free, but core.async can be a really nice way to wrangle together tasks, atoms are great for simple shared memory, and for complicated shared memory you have Haskell-style STM available. I don't remember the last time I had to reach for a raw mutex in Clojure.
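A minimal sketch of two of those constructs, using hypothetical counters rather than anything from a real codebase:

```clojure
;; Atoms: swap! applies a pure function via compare-and-swap and retries
;; on contention, so concurrent increments never lose updates.
(def hits (atom 0))
(let [workers (doall (repeatedly 8 #(future (dotimes [_ 1000] (swap! hits inc)))))]
  (run! deref workers))   ; block until every future finishes
@hits
;; => 8000

;; STM: dosync coordinates changes to several refs in one transaction,
;; so other threads never observe a half-applied transfer.
(def checking (ref 100))
(def savings  (ref 0))
(dosync
  (alter checking - 25)
  (alter savings  + 25))
[@checking @savings]
;; => [75 25]
```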
Good concurrency constructs are actually how I found Clojure; I was looking for a competent port of Go-style concurrency on the JVM, saw people raving about core.async (in addition to the lovely persistent maps), and immediately fell in love with the language.
Also, I really don't think the JVM is a downside; everyone hates on Java but the fact that you can still import any Java library means you're never blocked on language support. Additionally, if you're willing to use GraalVM, you can get native AOT executables that launch quickly (though you admittedly might need to do a bit of forward-declaration of reflection to get it working).
The JVM is one of the major selling points of Clojure. You can "write once, run anywhere" and benefit from Java's massive ecosystem, all without having to use a Blub language. Modern JVM implementations are also incredibly fast, often comparable in performance to C++ and Go.
It's almost as if different tools exist for solving different problems. Clojure is "Lisp on the JVM". That's the core premise behind the language. Rust is a "systems programming language with a focus on type and memory safety". This is an apples-to-oranges comparison. They offer different benefits while providing different drawbacks in return. Their ecosystems are likewise very different, in each case more closely tailored to their particular niche.
> i don't think you're wrong necessarily...but rust, golang, zig, mojo, etc are gaining popularity and imo they wouldn't be if they were JVM languages.
> understood, i'm just pointing out that people seem to prefer the apple over the orange.
This is kind of like saying that fewer people are drinking Coke every year and are choosing other beverages. It might be objectively true but it glosses over the fact that literally billions of people drink Coke daily and will continue to do so for decades to come.
The JVM is the same. Some people and organizations might be using zig or mojo (and I have absolutely nothing against zig or mojo, to be clear, I hope they succeed) but many multiple orders of magnitude more individuals and organizations run JVM stuff in a given year and will continue doing so.
At this point, the JVM is a civilizational technology. If it went away tomorrow, multiple banks would fail, entire countries would no longer be able to administer social services, millions of people would die. The JVM is in everything.
Developers on HN using zig, mojo, etc. aren't really a representative sample.
That's fair if you're looking at it from a performance perspective.
Not entirely fair if you look at it from a perspective of wanting fast feedback loops and correctness. In Clojure you get the former via the REPL workflow and the latter through various other means that in many cases go beyond what a typical type system provides.
> the opposite of boring
It's perhaps one of the most "boring in a good way" languages I ever used.
"The TIOBE index measures how many Internet pages exist for a particular programming language."
For some reason I doubt this is in any way representative of the real world. Scratch, which is a teaching language for children, bigger than PHP? Which is smaller than Rust? Yeah, these are results you get when you look at the Internet, alright.
Sure, that index isn't great (I think it's basically a regurgitation of Google Trends), but I don't think you're suggesting Clojure is actually a popular language, are you? Which is the only point I'm trying to make (that it isn't popular).
Clojure is reasonably popular as far as programming languages go. It's not difficult to get a job as a Clojure developer, particularly in certain sectors (fintech and healthcare are the heaviest Clojure users). Of course C++, Java, C# and PHP dwarf both Clojure and Rust by several orders of magnitude.
If anything, I think that makes Clojure better. Almost no one in the community is doing stuff to serve the "lowest common denominator", compared to how most JS/TS development is done, which is a breath of fresh air for more senior programmers.
Besides, the community and ecosystem are large enough that there are multiple online spaces for you to get help. Personally, I've been a "professional" (employed + freelancing) Clojure/Script developer for close to 7 years now, and I've never had any issues finding new gigs or positions, nor any issues hiring for Clojure projects.
Sometimes "big enough" is just that, big enough :)
Spinning dwindling adoption as a good thing because it "unburdens the community from serving lowest common denominator use-cases" is exactly the kind of downplaying/deflection of every issue that I'm talking about, which constantly happens in the Clojure community. It's such an unhealthy attitude for a community to have, and it holds it back from clearly seeing what the issues are and coming up with solutions to them.
Every problem people face is "not a problem" or "actually a good thing", or, if all else fails, maybe we can make users feel bad about themselves. Clojure is intended for "well experienced, very smart developers". Don't you know, our community skews towards very senior developers! So if you don't like something, maybe the problem is just that you're not experienced enough? Or maybe what you work on is just too low-brow for our very smart community!
> It's such an unhealthy attitude to have as a community
How about just "different"? Turtle wants to teach everyone to program, and that's fine; it's just another way of building and maintaining a language. Clojure is clearly not trying to cater to the "beginner programmer" crowd, and while you might see that as an "unhealthy attitude", I'd personally much prefer to recognize that having many different languages for different people is way better than every language trying to do the same thing for the same people. Diversity in languages is a benefit in my eyes, rather than a bad thing.
I'm glad it works for you and many others and gives you a good living. Nothing wrong with that. I wasn't trying to attack it or anyone that uses it, just stating why I never warmed up to it and projecting why I think it hasn't become popular.
You also need to learn a new tool to write lisp, like paredit.
While it's amazing once you've learned it, and you're slurp/barfing while making huge structural edits to your code, it's a tall order.
I used Clojure for a long time, but I can't go back to dynamic typing. I cringe at the amount of time I spent walking through code with paper and pencil to track things like what are the exact keyvals in the maps that can reach this function that are solved with, say, `User = Guest | LoggedIn` + `LogIn(Guest, Password) -> LoggedIn | LogInError`.
Though I'm glad it exists for the people who prefer it.
I'm surprised anybody coming from Clojure would say this.
You absolutely do NOT need to learn paredit to write Lisp; any modern vim/emacs/vscode plugin will just handle parentheses for you automatically.
That said, if you do learn the paredit-style workflow, nobody in any language in any IDE will come even close to how quickly you can manipulate the codebase.
I realize how lame it is to relitigate Clojure's downsides every time it comes up. I fell into the same trap that annoys me about Elm threads: people who haven't used it in a decade chiming in to remind everyone they didn't like some aspect of it. Wow, such contribution.
It's like seeing that a movie is playing at the theater so you show up only to sit down next to people to explain your qualms with it, lolz. Sometimes you need to let others enjoy the show.
The OP of this thread even said all that needed to be said "The learning curve is steep but very much worth it" yet we're trapped in this cycle because someone had to embellish it with a listicle.
I didn't realize it was a no-no to share opinions here.
Plus to be fair we're having this discussion in the context of an article from 2021 that just rose to front page of HN, only to repeat the same set of pros we've been hearing about Clojure for ages (code as data, repl, etc).
Ah, here we go again. Every single time Clojure gets mentioned on HN, some clueless egghead comes listing various "issues" without considering holistic, overall experience of using the language for real. Because they effing never did. Sure, it's so easy to "hypothesize" about deficiency of any given PL:
- Python: slow; GIL; dynamic; package management is shit; fractured ecosystem for a decade due to version split.
- Rust: borrow checker learning curve; compile times; half-baked async; too many string types; unreadable macros; constantly changing.
- Go: no generics for a decade, now bolted on awkwardly; noisy error handling; no sum types; no enums; hard to get right concurrency.
I can keep yapping about every single programming language like that. You can construct a scary-sounding wall of bullet points for literally anything, without ever capturing the cohesive experience of actually building something in the language. For all these reasons, programming in general could sound like a hard sell.
Stop treating Clojure like a "hypothetical" option. It doesn't need your approval to be successful - it already is. It's not going away whether you like it or not - despite your humble or otherwise IMOs and uneducated opinions. It's endorsed by the largest digital bank in the world, it scales to serious, regulated, high-stakes production systems. Not theoretically, not conceptually, not presumably - it has proven its worth and value over and over, in a diverse set of domains, in all sorts of situations, on different teams, dissimilar platforms. There are emerging use-cases for which there's simply no better alternative. While you've been debating whether to try it or not, people have been building tons of interesting and valuable things in it. Clojure is in no rush to be "sold" to you or anyone else. It's already selling like ice cream in July (on selected markets) and you just don't know it.
I am a Clojure fan and would love to use it. But you are right, we live in a real world where money talks and most organizations want to see developers as cheap, replaceable commodities.
Not to mention, in a post-AI world the cost of code generation is cheap, so orgs need even fewer devs. Combine all this with commonly used languages and frameworks and you need not worry about anyone being "too valuable to replace or fire".
Having said that - there may be a (very) small percentage of orgs which care about people, code crafting and quality and may look at Clojure as a good option.
> - syntax is hard to read unless you spend a lot of time getting used to it
This is only true if you assume C-like syntax is the "default."
But regardless of that, I'd argue that there's much less syntax to learn in LISPy languages. The core of it is really just one single syntactic concept.
But that's exactly the root of the complaint. Because there's (for the sake of argument) only one syntactic concept, there's no bandwidth for structural concepts to be visible in the syntax. If you're used to a wide variety of symbols carrying the structural meaning (and we're humans, we can cope with that) then `)))))))` has such low information density as to be a problematic road bump. It's not that the syntax is hard to learn, it's that everything else you need to build a program gets flattened and harder to understand as a result.
Even among lisps this has been problematic, you can look at common lisp's LOOP macro as an attempt to squeeze more structural meaning into a non-S-expression format.
To be pedantic, this isn't quite correct. Syntax isn't countable like that. What S-expressions are light on is production rules. At their most basic they have IIRC 7 production rules but there are absolutely 0 languages based on s-expressions which are that simple, since it doesn't give you anything like quasiquotes, vectors, Lisp 2 function resolution, etc. Reader macros make matters much worse.
What we can say is that they are constructively simple, but not particularly unique in that. Once you get into real sexpr languages they aren't simpler than horn clauses, and are constructively more complex than languages like Brainfuck and Forth.
It's repeated a lot because it's true. The collective developer world has decided that LISP syntax is not the preference. Good if you prefer it, but you're in the overwhelming minority.
I think this is kind of misleading. Yes, s-expressions have very simple syntax in and of themselves. But s-expressions are not all that's required to get all the control structures in Clojure. You need to memorize all the special forms and all the standard macros that people use on a day-to-day basis. And they're just as hard (actually, IME, harder) to memorize as any other syntax: let, cond, defrecord, if, condp, if-let, fn, def, defn, loop, recur, if-some, when-let, for, ->, ->>, as->, cond-> ...
To this day I have to look up whenever I get back into clojure what the "syntax" is of ns, require, import, etc.
I can link you to similarly undecipherable walls of text in Rust and Zig and C.
But I bet if you sat down a junior developer not yet entrenched in any style, they'd be able to grok Lisp code MUCH faster than the intricacies of the syntax of the other alternatives ¯\_(ツ)_/¯
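For instance, the threading macros from the list above are mechanical rewrites rather than new evaluation rules, which makes them easier to internalize than the list suggests:

```clojure
;; -> threads the value as the FIRST argument of each form:
(-> 5 (+ 3) (* 2))    ; expands to (* (+ 5 3) 2)  => 16

;; ->> threads it as the LAST argument, which suits sequence functions:
(->> (range 10)
     (filter odd?)
     (reduce +))      ; => 1+3+5+7+9 = 25
```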
Clojure on the JVM adds niche-language overhead to old deployment pain, so you get bleeding-edge bugs and mid-90s release rituals in the same stack. If you want to onboard new hires to Clojure expect to spend time on editor config and build tooling before they can even trust a stack trace. You still inherit Java's GC quirks without much type-driven tooling.
Typically you're either deploying via a container, in which case there's no more overhead than any other container deployment, or you're deploying directly to some Linux machine, in which case all you need is a JVM - hardly an arcane ritual.
People can invest in markets without a 401k with more options (plans commonly have only a handful of funds available) and less fees (both admin fees and inflated fund expense ratios). And you may pay more taxes with a 401k than otherwise depending on your future tax rate (which is unknowable).
The only pure advantage is employer matching if you have it and stay employed long enough for it to vest.
> It's infuriating. Nearly all of the agentic coding best practices are things that we should have just been doing all along
There's a good reason why we didn't though: because we didn't see any obvious value in it. So it felt like a waste of time. Now it feels like time well spent.
It's the user's fault. They vote for this crap with their attention. Junk sites like this shouldn't exist, but they do, and they aren't going anywhere until people stop using them.
Some users might enable these kinds of features with their attention, but I don't think users actually want these features, and any kind of "voting" is likely unintentional. It's manipulation. The fault lies mainly with the company and their carefully planned dark patterns. Ideally, users would punish them by e.g. leaving the platform, but there's friction that may be a bigger problem than the dark patterns (depending on the user). And I don't think there are any platforms that guarantee a good user experience now and in the future.
Not sure if users even realize what the dark patterns are and do. Users aren't all-knowing, with endless time, carefully balancing their attention to try to provide markets with the optimal signal to wisely guide the misbehaving actors.
Is it really the users' fault when the apps are literally designed by neuroscientists who explicitly design them to be addictive to humans, all of it funded by monopolist companies whose leadership tends to hold antidemocratic views about humanity?
Maybe we should finally regulate these addict boxes as the dangerous substances they are.
Users are not perfect agents. How can you expect the average non-technical person to figure out what is happening? For most people, if they don't visually see something happening on the screen, it doesn't exist. They simply have no frame of reference to figure out that LinkedIn is hijacking their scroll speed.
> I do regularly read the code that Claude outputs
You probably could have s/Claude/Human/ in your rant and been just as accurate. I don't know how many times I've flagged these issues in code reviews. And that's only assuming the human even bothered to write tests...
What I find is that when I ask AI to write tests it writes too many, and I agree with you that a lot of them are useless. But then I just tell it that, and it agrees with me and cleans it up. Much faster feedback loop and much better final result.
I feel like people who look at a poor result, stop there, and conclude it's useless have made up their minds and don't want to see the better results that are right in front of them if they'd just spend an extra 5 seconds trying.
How do you know whether the tests it spits out are bad if you don't read the tests?
We’re not dealing with AGI here. Tests aren’t strictly necessary for humans. They are for AI. AI requires guardrails to keep from spinning out. That’s essentially the entire premise of the agentic workflow.
I’m pretty sure they just meant that they do testing, not that they read the tests, and that’s how everyone else who responded interpreted it as well.
You can get Claude to write good tests, but based on what I’m seeing at work, that’s not what’s happening. The tests always look plausible even when they’re wrong, so people either don’t read them, skim them very quickly, or read the first few, assume the rest work, and commit.
I think Claude is great for testing because setting up test data and infrastructure is such a boring slog. But it almost always takes a lot of back and forth and careful handholding to get it right.
I read the tests. It's also really, really good to have Claude verify that removing the changes in question breaks the tests. This brings the quality way, way up for me.
> damage it will cause to the economy when you can no longer trust that you're on a video call with an actual person
What damage are you talking about?
I'm not sure I understand why it matters that there is no real person there if you can't actually tell the difference. You're just demonstrating that you don't actually need a human for whatever it is you're doing.
Your wife or mother calls you or video calls you and says to meet her somewhere, or to send money, or to pick up groceries or whatever. Does it not matter that it wasn't her? Could it be someone trying to manipulate you into going somewhere, to be robbed or whatever? At any rate, you'll need to verify that information came from the source you trust before you act on it, and that verification has a cost.
The damage is to the trust we have in our communication media. The conclusion here is that every person is trivial to impersonate; that's the damage.
Ok fine, let's put it in the context of business. Your competitor impersonates your customer, gives you bad instructions. After following the bad instructions, you lose the contract with your customer, and your competitor (the attacker) is free to try and replace you.
If you got a suspicious text, the logical thing is to call up the person who sent it and try to verify it. AI impersonation makes that much harder.
Or even better, open the on-prem AI portal and type something like "I just got a suspicious call from client X, but I am on a lunch break. Call him and use a fake video of me. Ask him if what he said is true..."
Because what you are actually doing is exchanging symbols, tokens, if you will, that may be redeemed in a future meatspace rendezvous for a good or service (e.g. a job, a parcel). These tokens are handshakes, contracts, video calls, etc. to be exchanged for the actual things merely represented therein.
Instead what we have now with AI is people exchanging merely the tokens and being contented with the symbol in-and-of itself, as something valuable in its own right, with no need for an actual candidate or physical product underlying the symbol.
There is a clip by McLuhan I can't be assed to find right now where he says eventually people will stop deriving pleasure from the products themselves and instead derive the feelings of (projected) accomplishment and pleasure from viewing advertisements about the product. The product itself becomes obsolete, for all you actually need to evoke the desired response is the advertisement, or the symbol.
A hiring manager interviewing an AI and offering it a job is like buying the advertisement you just watched, and.... that's it. No more, the transaction is complete.
>Instead of tending towards a vast Alexandrian library the world has become a computer, an electronic brain, exactly as an infantile piece of science fiction. And as our senses have gone outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. [...] Terror is the normal state of any oral society, for in it everything affects everything all the time. [...] In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture.
The grandparent post holds the belief that human interaction is intrinsically better. Not sure I agree, but I can understand the POV.
However, the increase in fake videos that are difficult to tell from real ones is indeed a potential issue. But the fact that misinformation is already so prevalent today is evidence that better video doesn't make things any worse than they already are, imho.
You're not sure if human to human interaction is intrinsically more valuable than a human talking to a facsimile? That feels like a very dangerous position to hold for one's ethical calculations and general sanity. I'm clinging tightly to the value of the bond with other people, even the passing connection, but certainly with my family members as this article is about.
I much prefer using the ATM, self-checkouts, and an e-commerce website over having to talk to somebody at a branch to get money, buy my groceries, or book a holiday.
Human to human may be more valuable, but that may not have much to do with the truth in their statements. For example if your relatives are hooked up to a constant misinformation feed it gets to become problematic to communicate and deal with them.
What I'm saying is that LLMs don't have to do truly novel work in order to be useful. They are useful because the lion's share of all work is a variation on an existing theme (even if the creator may not realize it).
The point is that saying the LLM failed to do what the overwhelming majority of devs can't do isn't exactly damning.
It's like Stephen King saying an AI-generated novel isn't as good as his. Fine, but most of us have much lesser ambitions than topping the work of the most successful people in the field.
- syntax is hard to read unless you spend a lot of time getting used to it
- convention for short var names makes it even harder
- function definition order makes it even harder
- too dynamic for most people's taste
- no type safety
- the opposite of boring
- no clear use case to show it clearly beating other languages
- niche with small community and job market
- JVM
For all those reasons, it's a hard sell for most, imo.