# UMD Analysis Qualifying Exam/Jan11 Real

Let ${\displaystyle f\in AC[0,1]}$ be an absolutely continuous function on ${\displaystyle [0,1]}$ with ${\displaystyle f>0}$. Prove that ${\displaystyle 1/f\in AC[0,1]}$.
Since ${\displaystyle f}$ is continuous on the compact interval ${\displaystyle [0,1]}$ with ${\displaystyle f>0}$, it attains its minimum there, so there exists some ${\displaystyle m>0}$ such that ${\displaystyle f(x)\geq m}$ for all ${\displaystyle x\in [0,1]}$.
Since ${\displaystyle f\in AC[0,1]}$, for any ${\displaystyle \epsilon >0}$ there exists some ${\displaystyle \delta >0}$ such that for any finite collection of disjoint intervals ${\displaystyle I_{k}=(x_{k},y_{k})\subset [0,1],k=1,\dots ,n}$ with ${\displaystyle \sum _{k=1}^{n}|y_{k}-x_{k}|<\delta }$, we have ${\displaystyle \sum _{k=1}^{n}|f(y_{k})-f(x_{k})|<\epsilon m^{2}}$.
Then for any such collection of intervals, we have ${\displaystyle \sum _{k=1}^{n}\left|{\frac {1}{f(y_{k})}}-{\frac {1}{f(x_{k})}}\right|=\sum _{k=1}^{n}\left|{\frac {f(x_{k})-f(y_{k})}{f(y_{k})f(x_{k})}}\right|\leq \sum _{k=1}^{n}{\frac {1}{m^{2}}}|f(y_{k})-f(x_{k})|<\epsilon }$, which shows ${\displaystyle 1/f\in AC[0,1]}$. ${\displaystyle \blacksquare }$
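As a numerical illustration (not part of the proof), the key estimate ${\displaystyle \sum |1/f(y_{k})-1/f(x_{k})|\leq m^{-2}\sum |f(y_{k})-f(x_{k})|}$ can be checked on a concrete positive absolutely continuous function; the choice ${\displaystyle f(x)=x+1}$ (with minimum ${\displaystyle m=1}$) and the particular intervals below are arbitrary examples, not taken from the problem.

```python
# Sanity check of the bound sum |1/f(y_k) - 1/f(x_k)| <= (1/m^2) * sum |f(y_k) - f(x_k)|
# for an example function f(x) = x + 1 on [0,1], which is AC with minimum m = 1.

def f(x):
    return x + 1.0

m = 1.0  # minimum of f on [0, 1]

# an arbitrary finite collection of disjoint intervals (x_k, y_k) in [0, 1]
intervals = [(0.0, 0.1), (0.2, 0.35), (0.5, 0.55), (0.9, 1.0)]

var_f = sum(abs(f(y) - f(x)) for x, y in intervals)
var_inv_f = sum(abs(1.0 / f(y) - 1.0 / f(x)) for x, y in intervals)

# the inequality used in the proof
assert var_inv_f <= var_f / m**2
print(var_f, var_inv_f)
```

Since ${\displaystyle f(x)f(y)\geq m^{2}}$ on every interval, the variation of ${\displaystyle 1/f}$ is always dominated by ${\displaystyle m^{-2}}$ times the variation of ${\displaystyle f}$, which is exactly why a ${\displaystyle \delta }$ that works for ${\displaystyle f}$ (with tolerance ${\displaystyle \epsilon m^{2}}$) also works for ${\displaystyle 1/f}$ (with tolerance ${\displaystyle \epsilon }$).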