How can a USB-RS232 converter manually assert its TX pin?
I made a custom PICAXE-based PCB that integrates all of the components needed to program the microcontroller. I assumed everything was fine because I could use it to program under Windows just fine; under Mac OS X, however, I was unable to program it. For reference, I am using the WCH CH340G USB-RS232 chip to reduce costs, as the FTDI FT232RQ is over 7x the price and I don't need all of its features.

I was thoroughly convinced that there was something electrically wrong with my design that caused the CH340G to misbehave under OS X. I had modified my board so I could perform a loopback test under CoolTerm, and transmission would freeze every several characters. Clearly something was messed up, and I thought it was a bug in the CH340G OS X driver. WCH technical support assured me that it wasn't, and suggested it could be a counterfeit chip. During this witch hunt, I picked up an off-the-shelf USB-RS232 converter based on the same chip, and it worked flawlessly. So much for a "driver bug" causing my headache! I then built up another unit with the minimum number of components required to perform the loopback test (i.e. no PICAXE), and I was able to get it to the point where the loopback test was reliable and no longer froze. So I likely do have something going on in the circuit that caused the freeze, but for now I could continue by wiring my PCB to a breadboard and using DIP parts for the external components to complete the programming circuit.

Now on to the meat of my question. I used two test setups: 1) my PCB wired to a breadboard with the PICAXE, and 2) a breadboard with the PICAXE and the off-the-shelf USB-RS232 converter. I connected each to my Windows laptop and wired the signals I thought were most relevant to my Saleae Logic 8 Pro. I captured the signals while programming under Windows, then moved the USB cable over to my iMac and captured them again.
Here's what I ended up seeing: the obvious difference is that the "Serial In" signal does not go logic high when the programming process starts. This is mandatory because of how the PICAXE programming protocol works (or at least, that's what it looks like): you assert the TX line over RS-232, which puts the PICAXE into a mode where it transmits a square wave, followed by bits that identify the chip type. It seems pretty clear from the logic analyzer capture that the CH340G driver under OS X isn't able to assert its TX pin, but the Windows version can.

In trying to gather as much information as possible so I can plead with WCH to help me with this problem, I am trying to understand how it is even possible to assert TX at all. In the Windows world, I believe everything ends up as a Win32 API call. I looked at the FTDI driver documentation just as a frame of reference, and I can see that it has functions for setting the state of DTR and RTS, for example, but not for setting the state of TX. On the Windows API side, you write data out of TX by calling WriteFile; there isn't a way that I know of to just say "hold TX low" or "hold TX high".

Can anyone explain how this might be achieved on either Windows or OS X, so that I can continue my investigation and debugging? There must be a way to do it that I'm missing, or else proof that there really is something missing in WCH's driver. After all, I do have the PICAXE AXE027 programming cable as well, and it is able to program the chips when running under OS X. If you know which functions are used under Windows and/or OS X to achieve logic-level control over the TX line, that would also be extremely helpful!

Update: I didn't know about the break feature. It seems like this could be what the WCH driver is missing under OS X. In my ongoing quest to figure out how to make this chip work, I loaded Ubuntu 16.04 LTS, which has the CH340 driver built in (at least, I think it's from WCH).
I ran the programming tool under Ubuntu and it also doesn't work. The WCH source does not compile under 16.04 due to the difference in kernel version. I tried to load the version I built on 12.04, but it won't load. I can't test under 12.04 because Chrome isn't supported under that OS anymore.
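From what I've since learned about the break feature: on Windows, the Win32 comm API exposes it through SetCommBreak, which suspends transmission and forces TX into the spacing state until ClearCommBreak is called. Here's a minimal sketch of what I believe a test would look like; the helper name `win_hold_break` and the port string are mine, and the stub branch just lets the file build on non-Windows systems:

```c
#ifdef _WIN32
#include <windows.h>

/* Hold TX in the break (spacing) state on `port` for `ms` milliseconds.
 * Returns 0 on success, -1 on failure. */
int win_hold_break(const char *port, unsigned ms)
{
    HANDLE h = CreateFileA(port, GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return -1;

    if (!SetCommBreak(h)) {       /* suspend TX, force the spacing state */
        CloseHandle(h);
        return -1;
    }
    Sleep(ms);                    /* line is held for the duration */

    BOOL ok = ClearCommBreak(h);  /* TX returns to the idle (mark) state */
    CloseHandle(h);
    return ok ? 0 : -1;
}
#else
/* Stub so the sketch also compiles on non-Windows systems; the Win32
 * calls above are the point. */
int win_hold_break(const char *port, unsigned ms)
{
    (void)port; (void)ms;
    return -1;
}
#endif
```

Something like `win_hold_break("\\\\.\\COM3", 500)` should hold TX for half a second; if I watch that on the logic analyzer with the Windows CH340 driver loaded, it would confirm that driver implements the break request.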
Re: How can a USB-RS232 converter manually assert its TX pin?
From a long history of programming serial ports and drivers: in the most general case, Rx and Tx are driven directly by a UART. They can't be remapped as general-purpose I/O pins, which is what you'd need for direct control, and a lot of UARTs don't allow those lines to be forced high or low.
Of course, your particular chipset may support it. Get the chip datasheet and see. Then, if it does, you'll need to confirm that the driver on your target OS supports it. OS X is just another BSD Unix sitting underneath the gloss, so it'll be using ioctls to perform those functions. In the bad old days (which is still now if you happen to be working in embedded environments), when we needed tighter control than the standard UART offered, we'd use alternate pins and just bit-bash them with data, effectively reimplementing the UART in software. That's not an option for you with a USB bridge in between, though, because of the latency and jitter over USB.
You could do it by replacing the WCH part with an Atmel, PIC, ARM, etc. device that supports both USB and serial, but there's a world of pain in front of you down that road. It is, however, what most programmers do now, especially when there's a very specific "almost standard serial, but with a couple of special features" protocol used for the programming: the host sees a plain serial driver over USB, and the device acts as a straight passthrough with the extra pin manipulation needed to handle those edge cases.
Now might be a good time to stop and re-evaluate, too. What are you trying to achieve? If it's just a cheap PIC programmer, consider what your time is worth: if anything, just buy a programmer and get on with whatever it was you wanted to do once the programmer was working. Or back up and run the tools in a Windows VM on your Mac.