I think at the end of the day, the question is whether a compiler for this type of language could efficiently handle a function like distinct-sorted:
> (distinct-sorted '(foo bar foo baz))
(bar baz foo)
This is a function that usually requires efficient hash tables and arrays to be performant: a hash table for detecting duplicates, and an array for efficient sorting. However, both the hash table and the array could theoretically be "optimized away," since neither is exposed as part of the output.
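For comparison, here is roughly what distinct-sorted looks like in a language that does expose native hash tables and arrays (a Python sketch; the function name mirrors the Bel example above):

```python
def distinct_sorted(items):
    # A set (hash table) detects duplicates in roughly O(1) per element;
    # sorted() then copies the survivors into an array for an O(n log n) sort.
    # Both structures are internal -- only the resulting list is exposed,
    # which is why a compiler could in principle choose them automatically.
    return sorted(set(items))

print(distinct_sorted(["foo", "bar", "foo", "baz"]))  # ['bar', 'baz', 'foo']
```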
A language like Bel that does not have native hash maps or arrays and instead uses association lists would have to rely entirely on the compiler to find and perform these optimizations to be considered a usable tool.
I'm no language expert, but the three things I can think of that make Bel impractical without major compiler trickery are (1) lack of primitive hash tables, (2) lack of primitive arrays, and (3) no support for tail-call optimization (though that third one is probably fixable with the right compiler tricks).
The other concern I have is the lack of a literal syntax for associative data structures (like curly braces in Clojure). It seems that would hurt pg's goal of "code simplicity" quite a bit.
> (1) lack of primitive hash tables (2) lack of primitive arrays
I'll note that if there are primitive arrays and the compiler optimizes arithmetic, the rest of the hash table can be implemented in Bel.
Also, maybe a Sufficiently Smart Compiler could prove that a list's cdrs will never change, store it in cdr-coded form, and treat it like an array (with the ability to zoom right to, say, element 74087 without chasing a bunch of pointers).
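A rough illustration of the payoff, with Python standing in for the two compiled representations (names hypothetical): indexing a linked list costs one pointer chase per element, while a list whose cars have been packed into a contiguous array is a single indexed load.

```python
# Ordinary cons-cell representation: nth must chase n pointers.
class Cons:
    __slots__ = ("car", "cdr")
    def __init__(self, car, cdr):
        self.car, self.cdr = car, cdr

def nth_linked(cell, n):
    for _ in range(n):
        cell = cell.cdr       # one pointer chase per element: O(n)
    return cell.car

# "Cdr-coded" representation: once the compiler proves the cdrs never
# change, the cars can live in one contiguous array and nth becomes a
# single indexed load: O(1).
def nth_cdr_coded(cars, n):
    return cars[n]            # no pointer chasing
```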