If you want baroque, look at Perl.
I have, and I want to claw my eyes out. That is hands-down the most stupidly designed programming language ever invented. I have had to deal with many Perl code bases, all of them atrociously unmaintainable, and all of them apparently written according to the Perl community's belief that if code seems incomprehensible, you just don't know enough Perl. If I can get the customer's permission, I'll post some for you to eviscerate. I'm convinced that no one should ever use Perl for commercial software development.
As for Python...I find the syntax quite awful, beginning, but not ending, with the significance of indentation and the lack of visible delimiters around code blocks.
I originally considered those weaknesses too, before I started programming in it, and to some extent I still think they are. But keep in mind that I started with assembler and Fortran, so horizontal spacing is something I've always associated with syntax, and it really doesn't bother me that much. In practice we use both horizontal spacing and delimiter characters, e.g., { }, in all our code, so Python's designers seem to have simply bitten the bullet and used the spacing itself to group instructions.
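To make the comparison concrete, here's a minimal sketch in plain stock Python (nothing beyond the standard range and print): the indentation is the only thing telling the interpreter which statements belong to the if, which to the for, and which to neither, exactly where C or C++ would have used { }.

# Indentation alone does the grouping; there are no braces or 'end' keywords.
for i in range(3):
    if i % 2 == 0:
        print(i, "is even")      # inside the if
    print(i, "checked")          # inside the for, but outside the if
print("done")                    # outside the loop entirely

Re-indent that middle print and you've changed what the program means, which is exactly the property people either love or hate about the language.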
I generally use C on embedded hardware, C++ for most things on the desktop, and Ruby for scripting and anything else that doesn't need high performance.
Tomorrow I need to learn Ruby. Any good references? For high-performance new work I use C/C++, but I still have to deal with legacy Fortran code bases. Sadly, I don't see much need for compiled languages on the desktop. If I'm paying for the development, I'd rather just buy eight more cores and 32 gigabytes more memory with the money I'd otherwise spend on programmer time to debug memory mishandling. Don't get me wrong, I understand very well how to debug memory leaks in compiled code. But that seems to be a dying art, and only the very highest-performance code needs to do its own memory management these days.
Try HAL/S sometime, especially the algebraic notation. It's trippy.
...rather misguided decision to use 1-based indices for arrays.
* JayUtah facepalms.
At the same time, I think it's pretentious of some computer science authors to start their texts at Chapter 0.