You should open up a history book, friend. It could help exercise the mind.
Germany declared war on the United States first. You tend to risk an invasion when you physically tell the other side, "Come at us, bro!"
The United States was considered neutral up until that point and wanted to avoid becoming entangled in another European conflict like WW1. There was a strong isolationist movement within the United States that lobbied to keep Americans out of the war, but that all changed when Japan, which was allied with Germany under the Tripartite Pact, attacked Pearl Harbor on Dec 7th, 1941. Four days later, Germany declared war on the United States.
Some would argue that Germany had just cause to declare war on the United States, citing the provocation of the American administration covertly working with the British government to aid in the defense of the United Kingdom: providing support under the table and attacking German shipping and aircraft (allegedly in self-defense).
Others will argue that Hitler had been spoiling for a fight long before then, that he greatly underestimated the industrial capacity of the United States, and that the attack on Pearl Harbor simply gave him the motivation to declare war.