Given an unknown discrete distribution, classical algorithms for estimating its Shannon and Rényi entropies have been intensively developed, and tight bounds on the number of queries to the distribution have been established. However, the quantum query complexity of entropy estimation remains open in general. We give quantum algorithms that achieve a quadratic speed-up for Shannon entropy estimation over the classical lower bound, as well as a speed-up for \alpha-Rényi entropy estimation when 1/2<\alpha<2. Our approach is inspired by Bravyi, Harrow, and Hassidim, but our algorithm design and analysis contain two new ingredients: one is a more careful analysis of the output distribution of the quantum counting primitive; the other leverages a generic quantum speed-up, due to Montanaro, for estimating the expectation of the output of any quantum procedure, which significantly improves our error dependence. Our approach can be extended to estimating the Kullback-Leibler (KL) divergence between distributions, with a quadratic speed-up in the size of the distributions' domain. We also obtain quantum lower bounds for Rényi entropy estimation.
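For concreteness, the quantities estimated above are the standard information-theoretic ones; a brief recap of the definitions, with the notation p = (p_1, ..., p_n) and q = (q_1, ..., q_n) for distributions on an n-element domain introduced here for illustration (the abstract itself fixes no notation):

\[
H(p) = -\sum_{i=1}^{n} p_i \log p_i,
\qquad
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha} \quad (\alpha > 0,\ \alpha \neq 1),
\]
\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}.
\]

As \alpha \to 1, the Rényi entropy H_\alpha(p) converges to the Shannon entropy H(p), which is why the range 1/2<\alpha<2 around \alpha=1 is the natural regime in which a Shannon-style speed-up can be expected to extend.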