> Thanks Michal. But there is still an issue. This is what I tried:
>
> Using \usepackage{upquote} does indeed correct the problem for tex4ht,
> but _only_ for the verbatim text in the above example, not for
> the normal text.
>
> Yes, the normal text appears the same in the PDF as on the
> web page, but the encoding can't be the same. I found this when I
> copied the normal text out of the PDF into a text file and looked at
> the hex encoding using
>
>> xxd -p foo.txt
>
This is maybe caused by the PDF viewer you use; I don't get graves using Acrobat Reader or pdftotext. The grave character is used to input quotes, i.e. ``hello'' will print correct English quotes, so it would be an error to get anything else in the extracted text. You can use the \`{} command to get a grave, or better, define \newcommand\textgrave{\`{}} and then use \textgrave in the document. This works in pdflatex as well as in tex4ht.

> It was hex 60, which is what I wanted, same as the input.
>
> But when I copied the normal text from the web page and looked at
> its hex encoding, it was the left single quotation mark, which
> causes the problem.
>
> So, the encoding inside the PDF can't be the same as the HTML generated for
> the normal text, even though they do appear to be the same (left single
> quotation mark) when looking at them on the screen.
>
> For PDF, I did not even need \usepackage{upquote}, and was able
> to copy both the normal and the verbatim text, and they both came out
> as a grave accent.
> But for htlatex, it only fixed the verbatim part, not the normal
> text part. This was the same result as when using the patched
> cmtt.htf I was testing with.
>
> So, there is still a problem with normal text. For now, I will use
> verbatim with \usepackage{upquote} to avoid this problem. But for
> normal text, I think there is still a problem, since it does not
> work like it does with pdflatex or lualatex.
>
> Thanks for your help.

You're welcome :)

Michal

> --Nasser
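For readers following the thread, here is a minimal sketch combining the two suggestions above (upquote for verbatim text, and Michal's \textgrave definition for normal text); the document class and sample text are assumptions, not taken from the thread:

```latex
\documentclass{article}
% upquote keeps backquotes in verbatim as straight grave accents
% (U+0060) instead of curly left quotes.
\usepackage{upquote}
% Michal's suggested command for a literal grave in normal text;
% works in pdflatex and in tex4ht (htlatex).
\newcommand\textgrave{\`{}}

\begin{document}
``hello'' is typeset with curly English quotes, as intended.

\verb|`hello`| keeps straight grave accents thanks to upquote.

A literal grave accent in normal text: \textgrave
\end{document}
```

Compiling with pdflatex or running htlatex on this file should show the intended behavior described in the thread.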