When Hitler came to power in 1933, he was not immediately seen as a threat to neighboring countries or the United States. In fact, his first few years in power struck many observers as a marked improvement over the Weimar Republic.
It wasn't until his unprovoked invasion of Poland in 1939 that the world fully took notice of Hitler's imperial designs. Not coincidentally, that was about the same time Hollywood began cranking out anti-Hitler propaganda.