Archived Forum Post


Question:

How to ensure SendBytes only sends one byte per character?

Feb 02 '15 at 11:20

Hi

When I used SendBytes to send some information based on ASCII codes, it sent two bytes per character, but I expected only one byte. I changed the call to SendString and set the character set to Windows-1252, and then only one byte per character was sent. Can SendBytes be used to send data using one byte per character?

Thanks, Simon


Accepted Answer

My guess is that you are using the ActiveX and passing a string variable to SendBytes. In any language using ActiveX, strings are passed as UTF-16 in a BSTR (COM terminology): the COM layer of your programming language passes SendBytes a Variant containing a BSTR. SendBytes always sends exactly the bytes you pass to it, so it sends the raw UTF-16 bytes of the BSTR, i.e. two bytes for each ASCII character.
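To see where the doubled bytes come from, here is a minimal pure-Python sketch (not Chilkat code, just an illustration of the encodings involved) comparing the UTF-16LE bytes a BSTR holds against the Windows-1252 bytes of the same string:

    text = "HELLO"

    # A BSTR stores the string as UTF-16LE: two bytes per ASCII character.
    utf16_bytes = text.encode("utf-16-le")
    print(len(utf16_bytes), utf16_bytes)    # 10 b'H\x00E\x00L\x00L\x00O\x00'

    # Windows-1252 uses one byte per character.
    cp1252_bytes = text.encode("cp1252")
    print(len(cp1252_bytes), cp1252_bytes)  # 5 b'HELLO'

Passing the string variable straight to SendBytes transmits the first of these two byte sequences.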

What you should do instead is call SendString. This method honors the StringCharset property, which you can set to "windows-1252". With StringCharset set to "windows-1252", you are assured the string is sent in its Windows-1252 byte representation (one byte per character), regardless of how your programming language passes strings. Some languages pass strings as UTF-16, others as UTF-8, and still others as ANSI, but you don't have to worry: whatever conversion is required (UTF-8 to Windows-1252, UTF-16 to Windows-1252, etc.) happens automatically.
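As a rough illustration of the conversion SendString performs when StringCharset is "windows-1252" (again a pure-Python sketch, not the actual Chilkat internals), the caller's bytes are decoded from whatever representation the language hands over and re-encoded as one byte per character:

    def to_windows_1252(raw: bytes, source_encoding: str) -> bytes:
        # Normalize caller bytes (UTF-16, UTF-8, ...) to Windows-1252.
        return raw.decode(source_encoding).encode("cp1252")

    # The same text arriving as UTF-16LE or UTF-8 yields identical wire bytes.
    print(to_windows_1252("Héllo".encode("utf-16-le"), "utf-16-le"))  # b'H\xe9llo'
    print(to_windows_1252("Héllo".encode("utf-8"), "utf-8"))          # b'H\xe9llo'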

The semantics of SendBytes are that you are sending raw bytes, which do not necessarily represent characters. The bytes could be, for example, the bytes of a JPG image. It would not make sense for Chilkat to try to interpret the bytes passed to SendBytes as text characters.
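For example (pure Python again), the opening bytes of a JPEG file are a perfectly valid payload for a raw-byte send but are not characters at all, so any attempt to treat them as text fails:

    # The JPEG SOI + APP0 marker bytes that begin a JPG file.
    jpeg_header = bytes([0xFF, 0xD8, 0xFF, 0xE0])
    print(jpeg_header)  # b'\xff\xd8\xff\xe0'

    try:
        jpeg_header.decode("ascii")   # these bytes do not represent text
    except UnicodeDecodeError as err:
        print("not text:", err)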