GitHub through company proxy



  • I'm stuck trying to access GitHub through my company's proxy (ZScaler, and yes, I know 🤮) but weirdly (??) only on Windows.

    On Linux, it's fairly straightforward. I set the https_proxy environment variable (or HTTPS_PROXY) and then git clone https://github.com/whatever/whatever.git works.
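
    Concretely, it's just something like this (the proxy address is a placeholder for whatever value actually works on your network):

    ```sh
    # Placeholder address -- substitute whatever your network actually uses
    export https_proxy=http://proxy.example.com:80

    # git's HTTPS transport (libcurl) picks the variable up automatically
    git clone https://github.com/whatever/whatever.git
    ```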

    (Cloning through SSH never works, but that seems to be because ZScaler blocks SSH -- and no, redirecting SSH to use port 443 doesn't work either. I don't really care though: as long as one method (HTTPS) works, I don't need SSH.)

    I got the value to use in https_proxy in two ways that work equally well in practice: option 1 is to go to http://ip.zscaler.com, which shows 5 or 6 different IPs whose purpose I can't tell, but at least one of them works. Option 2 is to ask a coworker who's already done this which value they used and copy that.

    A coworker told me that he managed to get it to work on Windows using the same method as on Linux (set the proxy, clone through HTTPS). But I consistently get a "Received HTTP code 403 from proxy after CONNECT" error.

    At this point, and after much checking, I suspect that by trying too many things I actually broke something (:surprised-pikachu:), and now I need to undo whatever that was so that it works again. But what?

    I've checked that my .gitconfig doesn't contain anything different between Windows and Linux, and nothing proxy-related (in fact, it only contains my username and email). I've also checked that I don't have stray variations of HTTP_PROXY / http_proxy / https_proxy / ... that might interfere.
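
    For what it's worth, the checks were roughly these (the grep just surfaces anything proxy-related):

    ```sh
    # Show every git config value together with the file it comes from
    git config --list --show-origin | grep -i proxy

    # List any proxy-related environment variables set in the current shell
    env | grep -i proxy
    ```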

    What else could cause it?


  • And then the murders began.

    @remi Did you maybe set the proxy as an environment variable?



  • @remi said in GitHub through company proxy:

    On Linux, it's fairly straightforward. I set the https_proxy environment variable (or HTTPS_PROXY) and then git clone https://github.com/whatever/whatever.git works.

    The non-S http_proxy is the odd one out: it's lowercase only, while all the others like HTTPS_PROXY and NO_PROXY are uppercase. Though curl should look for both cases of those.

    I got the value to use in https_proxy in two ways that work equally well in practice: option 1 is to go to http://ip.zscaler.com, which shows 5 or 6 different IPs whose purpose I can't tell, but at least one of them works. Option 2 is to ask a coworker who's already done this which value they used and copy that.

    That sounds excessively complicated. The HTTPS_PROXY variable can have a DNS name just fine. It can also have a user and password if needed.

    Since you are going through the proxy from your browser on Windows (otherwise ip.zscaler.com wouldn't tell you much), can you look at the Windows configuration?

    What else could cause it?

    It sounds like you are connecting to the wrong address.

    The setting that your Windows browser uses has to work, so you should start with that. There are two complications, though:

    • If the proxy setting uses a “proxy autoconfiguration script”, you'll have to read and interpret it yourself, since curl does not have a JavaShcrapt engine to do that.
    • If the proxy needs authentication, basic authentication works fine if you just give the username and password in the variable (like HTTPS_PROXY=http://user:password@proxy.nocom:6666), but if you …

    Actually, the git proxy configuration is a bit more flexible and even supports NTLM and GSS authentication.
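
    Something like this, for example (the proxy address is the same placeholder as above):

    ```sh
    # Point git itself at the proxy (placeholder address), instead of
    # relying on the environment variable
    git config --global http.proxy http://proxy.nocom:6666

    # If the proxy needs authentication, git can pick the method; valid
    # values include basic, digest, ntlm and negotiate
    git config --global http.proxyAuthMethod negotiate
    ```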

    But I guess you don't need authentication: you are not getting a 407, you are getting a 403, and that means that either

    • you are trying to use the wrong proxy, or
    • you are for some reason not allowed to access github.com from the computer you are on.

    Since the environment variables are the standard curl method for setting a proxy, you can test them with the curl that comes with Git Bash, and that will also show you whether the proxy gives you some extra information with the 403 status (use the -i option to curl to get the headers printed along with the response).
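
    I.e., roughly (again with a placeholder proxy address):

    ```sh
    # The same variable git would use
    export https_proxy=http://proxy.nocom:6666

    # -i prints the response headers along with the body; add -v to also
    # see the CONNECT exchange with the proxy itself
    curl -i https://github.com/
    ```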


  • Discourse touched me in a no-no place

    If it's just ZScaler Internet Access then have you tried with the proxy set to 127.0.0.1:9000?
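
    I.e. something like:

    ```sh
    # The ZScaler client's local proxy, assuming it listens on the default port
    export https_proxy=http://127.0.0.1:9000
    git clone https://github.com/whatever/whatever.git
    ```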



  • @loopback0 said in GitHub through company proxy:

    If it's just ZScaler Internet Access then have you tried with the proxy set to 127.0.0.1:9000?

    That seems to work!

    Though I'm saying "seems" because last week, when I posted the first message here, I was on VPN (working from home), and today I'm in the office (so no VPN needed), and I could not reproduce last week's errors (I had other, different errors...). So it is entirely possible that this solution might not work from home. I'll check next time I'm there (!).

    Thanks anyway.



  • To answer the other comments (thanks also):

    @Unperverted-Vixen said in GitHub through company proxy:

    @remi Did you maybe set the proxy as an environment variable?

    Yes, this is the main way I'm using to set the proxy. Either through the system dialog for setting environment variables, or directly in whatever terminal I'm using to test.

    I always assumed it would take precedence over the system proxy settings, or rather that git wasn't using the system proxy settings in the first place (since without setting those environment variables git doesn't work!). I'd say that the working solution confirms this, but maybe I missed something else.
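
    (One way I convinced myself, as a quick sketch: setting the variable for a single command only, which bypasses both the system dialog and anything lingering in the terminal:)

    ```sh
    # Scope the variable to this one command, overriding anything else
    https_proxy=http://127.0.0.1:9000 git clone https://github.com/whatever/whatever.git
    ```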

    @Bulb said in GitHub through company proxy:

    @remi said in GitHub through company proxy:

    I got the value to use in https_proxy in two ways that work equally well in practice: option 1 is to go to http://ip.zscaler.com, which shows 5 or 6 different IPs whose purpose I can't tell, but at least one of them works. Option 2 is to ask a coworker who's already done this which value they used and copy that.

    That sounds excessively complicated.

    Welcome to ZScaler... :rolleyes:

    The HTTPS_PROXY variable can have a DNS name just fine. It can also have a user and password if needed.

    I know that, but the problem is finding which values to use. Well, no username/password is needed in my case, but I don't have a clear name, or IP, for the proxy. I am not "supposed" to need it, so it's not documented anywhere (that I know of), and it can (and sometimes does...) change from time to time. So I'm left fishing around in various places for something that works, and hoping it won't change too soon.

    Since you are going through the proxy from your browser on Windows (otherwise ip.zscaler.com wouldn't tell you much), can you look at the Windows configuration?

    It uses a PAC file. Which, as you say, means reading it to see what it really uses. And it turns out that now it points to 127.0.0.1:9000 (which @loopback0 suggested). I still have an older version saved on my disk where it used something different (a list of at least 3 different IPs that rotated, AFAICT). I assumed this was still the case, so I didn't think to read it again (that is, to read the current version).
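
    (In case it helps anyone else: you can pull the PAC down and look at its PROXY directives yourself. The URL here is a made-up placeholder for whatever your Windows proxy settings actually point at:)

    ```sh
    # Placeholder URL -- use the one from the Windows "Use setup script" setting
    curl -s "http://pac.example.com/proxy.pac" | grep -n "PROXY"
    ```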

    Obviously this would also have solved my problem, so thanks for pointing it out. Basically I need to remember that whatever I learned from the PAC last time may have changed by the next time I need it (:headdesk:).

    Since the environment variables are the standard curl method for setting a proxy, you can test them with the curl that comes with Git Bash, and that will also show you whether the proxy gives you some extra information with the 403 status (use the -i option to curl to get the headers printed along with the response).

    I'll try and remember that. In particular the next time I work from home, in case the VPN makes the current solution stop working...



  • @remi said in GitHub through company proxy:

    It uses a PAC file.

    Yeah, I should have expected that. The mere existence of PAC is an insult to sanity (and to security).

    @remi said in GitHub through company proxy:

    In particular the next time I work from home, in case the VPN makes the current solution stop working

    I think it will work now, because there is now a local downstream proxy that finds the correct upstream for you.
    But without that, the list of IP addresses is almost certainly network-dependent, which is why the setup that worked for your colleague didn't work for you.



  • @Bulb said in GitHub through company proxy:

    But without that, the list of IP addresses is almost certainly network-dependent, which is why the setup that worked for your colleague didn't work for you.

    I expect so as well, but then again, for some other things in the past where I needed the proxy IP, either of the methods I described did work.

    So maybe those other times I was just lucky, I don't know. :mlp_shrug: Being a well-trained monkey, I remembered that if I did that then I got candy rather than an electric shock, and thus kept doing it.

    The mere existence of PAC is an insult to sanity (and to security).

    Based on how many times it has made my life harder, I would say that this applies to ZScaler as a whole. But it's not like I have any choice.



  • @remi said in GitHub through company proxy:

    Based on how many times it has made my life harder, I would say that this applies to ZScaler as a whole.

    ZScaler is just a proper WTF. Just like a lot of other security-theatre crap (I had Netskope for a while, which was a transparent proxy to boot; it broke quite a few things).

    The proxy autoconfiguration script pseudo-standard as a whole, though, that's a real purple abomination. Browsers have long since taken to interpreting it in a special sandbox with only a very few functions available, because otherwise it was too dangerous (it's quite easy to spoof), but it's still the worst tool for the job anybody could choose.


  • Notification Spam Recipient

    @remi said in GitHub through company proxy:

    Basically I need to remember that whatever I learned from the PAC last time may have changed by the next time I need it (:headdesk:).

    Yeah, it was a misguided thing to Auto Configure All The Things :magnets_having_sex: ...



    @Tsaukpaetra Toby Fair, when it's limited to just browsers and the mail client and maybe a couple of other very standard applications, the PAC thing does work in practice (for me -- I'm not saying the idea is a good one, just that I don't have to think about it in most cases).

    It's only when trying to go beyond those things and access the network "manually" that I get issues. Which, again Toby Fair, really isn't unexpected when there is an "automatic" system (it works until you step outside what it's designed to do...).

    So yeah, purple abomination, but I can see why it's used.

