Well, that's probably just tradition. Football has been the American version of "soccer". Nowadays Americans are (maybe a bit too much) proud of their traditional football because it's "manly". You know, football combines all the evolutionary manly instincts like protecting, hunting down prey, etc.
Americans often blame the game of football for becoming too weak with all the rules that try to protect players' health. They feel like their game is getting destroyed because players aren't allowed to basically kill each other on the pitch. Younger Americans might have a different view, and everyone who has actually played the game of football does as well, not gonna lie.