We study the $L_2$-approximation of functions from a Hilbert space $H$ in which function evaluation is a continuous linear functional, using function values as information. Under certain assumptions on $H$, we prove that the $n$-th minimal worst-case error $e_n$ satisfies $e_n \lesssim a_{n/\log(n)}$, where $a_n$ is the $n$-th minimal worst-case error for algorithms using arbitrary linear information, i.e., the $n$-th approximation number. Our result applies, in particular, to Sobolev spaces with dominating mixed smoothness $H = H^s_{\mathrm{mix}}(\mathbb{T}^d)$ with $s > 1/2$, where we obtain $e_n \lesssim n^{-s} \log^{sd}(n)$. This improves upon previous bounds whenever $d > 2s + 1$.