A Markov chain arising from Gaussian elimination

Consider a Markov chain on the set GL_n(F_2) of n \times n invertible matrices mod 2, where in each step two distinct rows i, j are chosen uniformly at random and row i is added to row j mod 2. How does the mixing time grow with n? The best known upper bound is O(n^3), and the best known lower bound is of order n^2/\log n.
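
For concreteness, here is a minimal simulation sketch of the chain (the function name, the use of NumPy, and the parameter choices are illustrative, not part of the problem statement):

```python
import numpy as np

def row_addition_step(A, rng):
    """One step of the chain: choose distinct rows i, j uniformly at
    random and add row i to row j, mod 2."""
    n = A.shape[0]
    i, j = rng.choice(n, size=2, replace=False)
    A[j] = (A[j] + A[i]) % 2

# Row additions are invertible, so a chain started at the identity
# stays inside GL_n(F_2).
rng = np.random.default_rng(0)
n = 6
A = np.eye(n, dtype=np.uint8)
for _ in range(10 * n):
    row_addition_step(A, rng)
print(A)
```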

It is known that the mixing time for a single column of the matrix, which evolves as a Markov chain on the nonzero vectors of F_2^n, is of order n \log n, and in fact this column chain exhibits cutoff.
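
As an illustration of the single-column claim, here is a sketch of the marginal chain, assuming the standard identification: adding row i to row j in the matrix replaces entry v_j of each column v by v_j + v_i mod 2. Tracking the Hamming weight is only a rough diagnostic, not a proof of mixing.

```python
import numpy as np

def column_step(v, rng):
    # Marginal move on one column: adding row i to row j in the full
    # matrix replaces v[j] by v[j] + v[i] mod 2.
    n = len(v)
    i, j = rng.choice(n, size=2, replace=False)
    v[j] = (v[j] + v[i]) % 2

# Start at a standard basis vector (a column of the identity) and
# record the Hamming weight every n steps.  Under the uniform
# distribution on nonzero vectors, the weight concentrates near n/2.
rng = np.random.default_rng(1)
n = 200
v = np.zeros(n, dtype=np.uint8)
v[0] = 1
weights = []
for t in range(int(3 * n * np.log(n))):
    column_step(v, rng)
    if t % n == 0:
        weights.append(int(v.sum()))
print(weights)
```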

R. Rivest asked for the mixing time when the distance to stationarity is measured using only test functions computable in polynomial time.
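
To make Rivest's question concrete, here is a hedged example of one polynomial-time test function, the top-left entry of the matrix; this choice of statistic and the Monte Carlo comparison below are illustrative, not from the source. Under the uniform distribution on GL_n(F_2), the first row is uniform over the nonzero vectors of F_2^n, so this entry has mean 2^{n-1}/(2^n - 1).

```python
import numpy as np

def step(A, rng):
    # One move of the chain: add a uniformly chosen row i to a
    # distinct row j, mod 2.
    n = A.shape[0]
    i, j = rng.choice(n, size=2, replace=False)
    A[j] = (A[j] + A[i]) % 2

# Ergodic average of the test function f(A) = A[0, 0] along one
# trajectory; after a burn-in it should approach the uniform mean
# 2^(n-1) / (2^n - 1).
rng = np.random.default_rng(2)
n = 8
A = np.eye(n, dtype=np.uint8)
vals = []
for _ in range(50000):
    step(A, rng)
    vals.append(int(A[0, 0]))
print(np.mean(vals[10000:]), 2**(n - 1) / (2**n - 1))
```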
