Japan are Women's World Cup champions

Japan became the first Asian nation to win the Women's World Cup after a thrilling final against the USA in Frankfurt.

Al Jazeera's Nick Spicer reports from Frankfurt, Germany.
