QUIZ: How Much Do You Know About Florida, The Sunshine State?
In which year did Florida officially become a state?
- 1867
- 1905
- 1845
- 1812

Answer: 1845. On March 3, 1845, the United States officially admitted the former Florida Territory into the Union. Florida had been an organized, incorporated territory of the country since 1822.