Game theory: Are there any other equilibria in this game?

I took the following game from the Peter Winkler collection (chapter "Games"):

Two numbers are chosen independently at random from the uniform distribution on [0,1]. Player A then looks at the numbers. She must decide which one of them to show to player B, who upon seeing it, guesses whether it's the larger or smaller of the two. If he guesses right, B wins, otherwise A wins. **Payoff to a player is his/her winning probability**.

One easily identifies the following **mixed strategy Nash equilibrium**:

"Player A shows the larger number with prob 1/2 and player B guesses 'larger' with prob 1/2"
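This is easy to verify numerically (my own sketch, not from the book): if B flips a fair coin, his guess is independent of which number A shows, so he is right with probability exactly 1/2 no matter what A does, and symmetrically for A's coin flip. A quick Monte Carlo check:

```python
import random

def mixed_play(trials=200_000, seed=2):
    # Both players randomize 50/50, per the mixed equilibrium above.
    # Returns B's empirical winning probability.
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        v = max(x, y) if rng.random() < 0.5 else min(x, y)  # A shows larger w.p. 1/2
        guess_larger = rng.random() < 0.5                   # B guesses "larger" w.p. 1/2
        if guess_larger == (v == max(x, y)):                # B wins iff his guess is right
            wins += 1
    return wins / trials

print(mixed_play())  # ≈ 0.5
```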

The book also suggests a smart pure strategy for A, which is in effect equivalent to her mixed strategy above (in the sense that it locks her winning probability at 1/2 regardless of B's strategy):

"Player A shows the number which is closer to 1/2"
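To sketch why this works (my reasoning, not the book's): if A shows the value $v$ that is closer to 1/2, say $v>1/2$, the hidden number must be farther from 1/2, i.e., lie in $(v,1]$ or $[0,1-v)$. These intervals have equal length, so conditional on $v$ the hidden number is equally likely to be larger or smaller, and B learns nothing. A Monte Carlo check against a few candidate strategies for B:

```python
import random

def a_shows_closer_to_half(x, y):
    # A's pure strategy from the text: show whichever number is closer to 1/2
    return x if abs(x - 0.5) < abs(y - 0.5) else y

def simulate(b_guess, trials=200_000, seed=0):
    # B's empirical winning probability against A's "closer to 1/2" strategy.
    # b_guess(v) returns True if B guesses "larger" upon seeing v.
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        v = a_shows_closer_to_half(x, y)
        if b_guess(v) == (v == max(x, y)):  # B wins iff his guess is right
            wins += 1
    return wins / trials

# B's winning probability is ≈ 1/2 no matter what he does:
print(simulate(lambda v: True))       # always guess "larger"
print(simulate(lambda v: v > 0.5))    # threshold at 1/2
print(simulate(lambda v: v > 0.9))    # some other threshold
```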

A little thought shows that B has an analogous pure strategy:

"Player B guesses larger iff the number he sees exceeds 1/2"

Together these two strategies form a **pure strategy Nash equilibrium**.
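As a sanity check (my own sketch, not from the book): since the game is zero-sum with value 1/2, at this strategy pair each player wins with probability 1/2, and a unilateral deviation should not help the deviator. Simulating the pair and two deviations:

```python
import random

def play(a_show, b_guess, trials=200_000, seed=1):
    # B's empirical winning probability when A plays a_show and B plays b_guess
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        v = a_show(x, y)
        if b_guess(v) == (v == max(x, y)):  # B wins iff his guess is right
            wins += 1
    return wins / trials

closer_to_half = lambda x, y: x if abs(x - 0.5) < abs(y - 0.5) else y
threshold_half = lambda v: v > 0.5  # guess "larger" iff v > 1/2

# Equilibrium pair: B wins with probability ≈ 1/2
print(play(closer_to_half, threshold_half))

# A deviates (always shows the larger number): B's winning probability rises to ≈ 3/4,
# so A can only hurt herself by deviating
print(play(lambda x, y: max(x, y), threshold_half))

# B deviates (always guesses "larger"): still ≈ 1/2, since A's strategy is uninformative
print(play(closer_to_half, lambda v: True))
```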

-------------------------------------------------------------------------------------------------------------------

To be clear, let me define a pure strategy for B as a function $f_B : [0,1] \longrightarrow \{\text{larger}, \text{smaller}\}$, i.e., he assigns "larger" or "smaller" to every real in $[0,1]$.

Similarly, a pure strategy for A is a function $f_A$ with $f_A(\{x,y\}) \in \{x,y\}$, i.e., she assigns either $x$ or $y$ to every set $\{x,y\}$ with $x,y \in [0,1]$.

**My question** is: Is the above pure strategy Nash equilibrium unique? With these definitions there are infinitely many pure strategies for each player. Could other, less obvious or highly artificial, equilibria be constructed (aside from equilibria that differ only on a set of measure zero)? How can we prove or disprove the uniqueness of the equilibrium over this infinite strategy space?