Squid proxy: character failure on Chinese websites

I set up Squid 2.7 on my Windows 7 64-bit notebook in transparent mode. Other notebooks can access the internet through the proxy on my notebook, but through the proxy the web pages do not display Chinese characters the way they should.
I found a suggestion to write something like -*- coding: utf-8 -*- into the first line, but it didn't help.
The broken characters appear from the very first hit.
The web pages are encoded in UTF-8 and delivered by the proxy in UTF-8.
EDIT: I found out that the encoding problem only occurs if I open a web page on another notebook that is connected through my PC. If I use the proxy locally on my PC, the character encoding is fine. Russian websites don't have the problem either; it's only Chinese and Japanese.
Any solutions to this problem?
Thanks a lot.
Mark

Finally I found out that the damned notebook which showed the broken characters didn't have East Asian language support installed. Argh. Everything is fine with the proxy. I spent hours of my life on such a silly mistake.
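For anyone debugging something similar: one way to rule the proxy in or out is to compare the bytes fetched directly against the bytes fetched through it. A minimal sketch with curl (the proxy address 192.168.0.10:3128 is a placeholder for your own setup):
# check the charset the server declares when fetched through the proxy
curl -x http://192.168.0.10:3128 -sI http://example.com | grep -i content-type
# fetch the page both ways and compare the raw bytes
curl -s http://example.com -o direct.html
curl -x http://192.168.0.10:3128 -s http://example.com -o proxied.html
cmp direct.html proxied.html
If cmp reports no difference, the proxy is delivering the page intact and the problem is on the client, as it turned out to be here.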

Related

Unicode characters incorrect or missing with Puppeteer within Docker

When I use Puppeteer locally and produce a PDF with stock settings and headless mode, this character
🌐
is drawn correctly in the resulting PDF. When I try to run the same thing within Docker, the character renders incorrectly. This happens with many Unicode characters within Docker; they render incorrectly or not at all.
I've read the posts about this, and it seems like it's a font problem? I have Puppeteer running within Docker, using node:16 as a base. Is it just that my base image doesn't have enough fonts? I've tried installing Chrome and the other suggestions from the Puppeteer Dockerfile and related posts, but they don't work. Using Puppeteer 18.0.2 on node:16.
If it's a font issue, does somebody know how to get fonts installed that will work and produce the characters generally found in headful browsers? Or is it something else? OK, just posting this in 2022 since many of the answers from the past are old and outdated. Thanks for any help.
Try using the official Docker image that pre-installs many fonts: https://github.com/puppeteer/puppeteer/blob/main/docker/Dockerfile#L11
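If you would rather stay on your own node:16 base instead, installing CJK and emoji fonts in the image is usually enough. A minimal sketch, assuming a Debian-based image (fonts-noto-cjk and fonts-noto-color-emoji are standard Debian packages; the package choice is my assumption, not something Puppeteer mandates):
FROM node:16
# Noto covers most CJK glyphs; the color-emoji package covers characters like 🌐
RUN apt-get update && apt-get install -y --no-install-recommends \
    fonts-noto-cjk \
    fonts-noto-color-emoji \
  && rm -rf /var/lib/apt/lists/*
Rebuild the image and Chromium inside the container should pick the fonts up through fontconfig without further configuration.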

Arabic Encoding by Quickbooks Desktop not readable

We have an application that reads from QuickBooks Desktop 2012, which reads/writes Arabic if you set the Windows regional settings to Arabic. However, when we try to read the data via the QuickBooks SDK, it is returned as garbage like this: "äÈÞ áÇÏÇÑÉ ÇáãäÊÌÚÇÊ ÇáÓíÇÍíÉ"
How do we decode the above encoding? We tried many decoders, but none could recover the original text, which is readable in the QuickBooks Desktop UI.
Appreciate your help
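The sample looks like Windows-1256 (Arabic) bytes that were decoded as Windows-1252, a classic mojibake pattern. If that is what happened, round-tripping the text through those two code pages should recover the Arabic. A sketch with iconv, assuming the garbled text was saved as UTF-8 in garbled.txt:
# turn the mojibake back into its original bytes, then decode those bytes as Arabic
iconv -f UTF-8 -t WINDOWS-1252 garbled.txt | iconv -f WINDOWS-1256 -t UTF-8
If the guess is right, the sample should come back as readable Arabic (something like "نبق لادارة المنتجعات السياحية"); if iconv errors out, the source encoding is something else, e.g. CP720.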

YAWS crash uploading 7 MB file

I'm using a Raspberry Pi-like board with YAWS 2.0.4 and Erlang 19.
I wrote two web pages to upload a file and save it on the server: with a "larger" file (I mean ~7 MB) the server crashes; with smaller files everything works fine.
I already tried the example code from the YAWS site and another version with the temp_file and binary options, but it doesn't work.
Any suggestions?
Thanks in advance.
After spending much time on it, I found the problem: the partial_post_size parameter in the YAWS configuration was set much too high.
I changed it back to near the default value (10240) and everything works fine.
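For reference, partial_post_size is set per virtual server in yaws.conf. A minimal sketch (server name, port, and docroot are placeholders):
<server upload>
    port = 8080
    listen = 0.0.0.0
    docroot = /var/www
    partial_post_size = 10240
</server>
With a bounded partial_post_size, YAWS hands the upload to your code in chunks of at most that many bytes instead of buffering huge POST bodies, which keeps memory use flat on a small board.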

Chinese and Japanese character encoding issues when exporting HTML to PDF

I run a web-based timeline maker that lets users create timelines in HTML/JavaScript and then export them to PDF files for printing when they're done.
I have had several users report issues with exporting their timelines to PDFs when the timelines contain certain Unicode characters. Here, for example, is a screenshot showing the web page and the PDF file that is generated.
I've been trying to wrap my head around why some Unicode character blocks like Block Elements and Georgian will export but Chinese and Japanese will not. Also, the export works correctly when I perform it on my local computer, but results in the above output when exporting on Heroku.
Does anyone know what might be causing this?
For completeness, the backend is in Ruby on Rails and it uses the PDFKit gem to convert the HTML page to a PDF and the site is hosted on Heroku.
It sounds like it might be an issue with the fonts on the server. The webpage version of the timeline renders correctly because you obviously have the correct font on the client machine that is running the browser. The PDF on the other hand is generated on the server, and thus has to use a font available to it there.
If that's the case, then using a font that both exists on the server and supports the correct CJK characters should fix this issue.
Having personally experienced this with Rails and Heroku, I can tell you the reason is either (A) the fonts on your system not matching the fonts on Heroku, (B) PDFKit having trouble loading custom fonts linked through CSS, or some combination of both.
Most likely, you are referencing fonts on your local system (which contain glyphs for special characters) that don't match the fonts on Heroku. Run fc-list in Heroku's bash to get a list of their installed fonts, and substitute your font(s) for one that has the needed extended charset. However, now you will have to ensure that this font is also installed on your local machine. (Even worse, you could use different fonts for dev and production.)
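For example, with the Heroku CLI (heroku run starts a one-off dyno):
heroku run bash              # open a shell on a one-off dyno
fc-list : family | sort -u   # then, inside the dyno, list the installed font families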
You can also try uploading fonts to Heroku and linking them from there. However, I've found this method to be unreliable when spanning multiple systems or dev/staging/production environments, because each and every system has to have the required fonts installed. And even then, PDFKit makes you jump through hoops to get CSS fonts to work (for example, because of subtle variations in how different operating systems interpret font names).
The best solution I've found is to encode and embed fonts directly into the CSS. Base64-encode a font and add it to the stylesheet:
@font-face {
  font-family: 'OpenSans';
  src: url(data:font/truetype;charset=utf-8;base64,AAEAAAATAQA...);
}
Now you have a bulletproof stylesheet that's portable and compatible with every system.
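To produce the base64 payload, you can encode the font file yourself. A sketch with GNU coreutils (on macOS use base64 -i OpenSans.ttf instead; the filename is a placeholder):
# emit the font as one long base64 line, ready to paste into the data: URL
base64 -w 0 OpenSans.ttf > opensans.b64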
If you use Docker and are having the same issue as above, try installing Japanese fonts in the container: apt-get install fonts-takao-mincho
If it works, then add it to your Dockerfile:
RUN apt update && apt install -y \
  # Japanese fonts
  fonts-takao-mincho

Line Character / Default Line Endings in Coda 2

I am having some issues with Coda 2. I work in a hybrid Windows/Mac environment, and I am on a Mac running Coda 2 to do my development work. After hitting Command-, I set my default File Encoding to Unicode (UTF-8) and my Default Line Endings to Windows. Yet if you open one of my files in Notepad++ in a Windows environment, it says the file was saved with UNIX end-of-line characters.
Any ideas on how to get around this? I have done some research online, and everybody seems to think that once these options are set I should be fine, but that's not the case. I am running Coda 2.0.9.
It appears that the only time this happens is when I send an HTML file via email. If I send the file as a zip it is fine, and if I upload it to an FTP location it's fine. Not Coda's fault at all.
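If anyone needs to verify what a file actually contains, the file utility reports CRLF terminators explicitly (the filename is a placeholder):
file page.html            # prints "... with CRLF line terminators" for Windows endings
od -c page.html | head    # shows the raw bytes; look for \r \n pairs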
