Exactly this. People who are new to programming -- and aren't, say, majoring in math -- don't understand or appreciate determinism and exactness. That is, when their "program doesn't work", unless you intervene as the instructor, they'll try every random thing they find on StackOverflow until their program "works" (i.e. no longer raises an exception).
I love the elegance possible in Ruby syntax, but it allows for way too much ambiguity. If you don't understand how parsers/interpreters fundamentally work, Ruby can seem as loose and permissible as everyday English, which is a huge stumbling block for people entirely new to programming. I admit to slamming my fist on my desk the first time I tried writing Python only to get an indentation error, but Python's explicitness is incredibly helpful in making clear the exactness that computation demands.
And Python's design helps prevent many of the catastrophic, hard-to-debug errors that Ruby accepts without raising an exception. For example, the following runs without complaint in Ruby, while the Python equivalents raise errors:
if x = 9
  puts x + 1 # prints 10, because x is assigned 9 first (Ruby warns at most, never raises)
end
y = {}
z = y["this doesn't exist"] # z is set to nil
Yes and no. Consider that these are different in Ruby but the same in almost every other language:
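A commonly cited example (value here is a made-up method, purely for illustration):

def value(x = 10)
  x
end

a = value - 1 # a is 9: parsed as value() - 1, i.e. 10 - 1
b = value -1  # b is -1: parsed as value(-1)

If value were a local variable instead of a method, both lines would parse as plain subtraction -- the meaning of "value -1" depends on context the parser has to track.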
Ever had to debug code that relied on this behaviour? Or had to explain exactly why it parses that way to someone unfamiliar with yacc?