The short answer is that it's a cache issue that will clear itself up for most folks. The longer answer is slightly more complicated.
I'll try to see what we can do without sacrificing too much of the benefits of how the avatars are being displayed. For that explanation, keep reading...
To help speed up page load times we use a number of techniques, including aggressive image caching and, more importantly, Amazon's S3 for image storage and Amazon's CloudFront as a CDN.
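To give a feel for what "aggressive image caching" means in practice, a server can send a response header along these lines. The exact values here are invented for illustration; I'm not quoting the site's real configuration:

```http
HTTP/1.1 200 OK
Content-Type: image/jpeg
Cache-Control: public, max-age=31536000, immutable
```

That `Cache-Control` line tells every cache between the server and your screen (CDN, browser) that it may keep the image for up to a year without asking for a fresh copy.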
"So what," you say, "what does that have to do with avatars?"
OK, bear with me for a minute, because to answer that question I need to explain how the S3 and CloudFront stuff works at a really high level.
If you envision an "average" web site, the software is running on a server and all of the images, files, and miscellaneous content is stored on that same server. That works, but it's not ideal for sites that use a lot of resources, like images, or that may have visitors from all over the world.
Amazon S3 eliminates the problem of storing a lot of resources like images. When an image is uploaded to the site, it is transferred to Amazon's S3 'cloud' storage instead of being stored on the local server. That way, worrying about hard drive space from a lot of uploaded images goes away; instead of taking up space on the site's server, the S3 cloud offers a nearly endless amount of space. The site itself is still served from the site's server and the data is still stored there, but the images are not; they are all stored on Amazon's S3 servers.
S3 eliminates worrying about drive space, but for an image-intensive site like ours the problem becomes how long it takes to load a page. The site's server is physically in the USA. Amazon's S3 servers that we use are in the USA. Our members, though, are all over the world. When a visitor from the USA loads a page with a lot of images the load time might seem fine, but for a visitor from Australia [just an example country chosen at random] the same page could take a lot longer to load. That is because there are more 'hops' between their local PC in Australia and the site's servers in the USA. The more connections needed to reach the server, the longer it takes to retrieve the content and show the page. For the page data that really isn't an issue, because text is pretty small and even with a large number of hops it comes back pretty fast. Images, on the other hand, can really slow a page view down. A typical gallery thread here easily has a dozen or so images per page, each of which could be a few megabytes in size, so trying to retrieve that much data from a server in the USA while sitting in a cafe in Australia could result in a really long page load time. That is where Amazon's CloudFront CDN comes in.
CloudFront is a bunch of servers, like S3, but instead of sitting in a single giant datacenter somewhere in the US, like our S3 servers do, the CloudFront servers are distributed across data centers in major cities all over the world. In the case of Australia, CloudFront servers are in both Sydney and Melbourne. After an image is uploaded to the site it is transferred to the S3 servers. When the visitor in Australia wants to view the images, Amazon CloudFront acts as a middleman and decides that, since the visitor is in Australia, it will copy the image from the S3 server to the CloudFront server that is physically in Australia and serve it from there instead of from the US servers.

Now the visitor displaying the page in Australia is still requesting the page data from the site's US server, but all of the images load way faster because they are coming from a server that is local to them. If it is the first time somebody from Australia is requesting the page, the images will be copied from the US S3 server to the local CloudFront server. When that visitor loads the page a second time, or a different visitor from Australia requests the page, they'll both get the images that are cached on their local CloudFront server. The same thing happens with visitors all over the world where Amazon has CloudFront servers: the data comes from the site's US server, but the images actually come from a server that is much closer to them.
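The miss-then-hit behavior described above can be sketched in a few lines. This is a toy model, not CloudFront's actual code; the names (`fetch_from_origin`, `edge_get`) are invented for illustration:

```python
# Toy sketch of how a CDN edge cache behaves: a cache miss pulls the
# object from the origin (our S3 servers in the US), a cache hit serves
# the copy already sitting on the local edge server.

origin_fetches = 0  # counts the expensive trips back to the US

def fetch_from_origin(key):
    """Pretend to pull an object from the origin store (e.g. S3)."""
    global origin_fetches
    origin_fetches += 1
    return f"bytes-of-{key}"

edge_cache = {}  # the edge server's local store, e.g. in Sydney

def edge_get(key):
    if key not in edge_cache:              # miss: first visitor pays for the trip
        edge_cache[key] = fetch_from_origin(key)
    return edge_cache[key]                 # hit: later visitors get the local copy

first = edge_get("gallery/photo123.jpg")   # copied from origin to the edge
second = edge_get("gallery/photo123.jpg")  # served straight from the edge cache
```

Note that no matter how many visitors in that region request the image afterwards, the origin is only contacted once until the cached copy expires.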
So an image gets uploaded to the site (Cameraderie), it gets transferred to the cloud (S3), and a copy is stored locally in cities all over the world (CloudFront). Even with all that, though, the site sets an aggressive cache policy so that your browser will cache the image on your local device instead of requesting a fresh copy every time. That means that, as long as nothing funky is going on, the first time you request a page and it gets displayed in your browser, the images are temporarily stored in your device's browser cache. The second time you view that page the images display really fast because they are no longer being downloaded; they are coming from your device.
This works great for attachments because attachment images don't change. There is no way to replace an existing attachment image with a different image. You can delete the existing attachment and upload a new one, but you are not replacing the first one, you are adding another. Even if you give it the same name they are physically two entirely different attachments. They could have the exact same name, same file size, same everything; it doesn't matter, they are two different files physically.
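One way to picture why attachments are safe to cache forever: each upload gets a brand-new storage key, so a cached URL can never silently start pointing at different content. The key scheme below is invented for illustration, not the site's actual naming:

```python
# Sketch: every attachment upload mints a unique storage key, even if the
# filename and bytes are identical to a previous upload.
import uuid

def store_attachment(filename, data):
    # A fresh random ID per upload means the resulting URL never collides
    # with, or replaces, any earlier attachment.
    key = f"attachments/{uuid.uuid4().hex}/{filename}"
    return key

a = store_attachment("sunset.jpg", b"same bytes")
b = store_attachment("sunset.jpg", b"same bytes")  # same name, same bytes
# a and b are still two different objects; the old cached URL stays valid forever
```

Because the URL for a given attachment never changes meaning, caches at every layer can hold it as long as they like with no risk of showing the wrong image.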
And now we come to how the site software handles avatars. Same way as attachments, right? No, no, that'd be too easy. Avatar images do replace the current one: when you upload a new avatar it overwrites the existing one in place.
Now at this point some of you may see where I'm going with this.
Avatars, when updated, go to the S3 servers. When pages request them they are cached on the CloudFront servers. When they are displayed in your browser they are cached on your device. Eventually your browser cache entry will expire and request a new version of the avatar, which will come from the CloudFront server, which itself may or may not still be caching the old version of the avatar; when that cache expires it'll request a fresh copy from the S3 server. In the interim your browser is still showing you the old avatar.
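A single layer of that chain can be modeled with a simple time-to-live (TTL) cache. Until the entry expires, you keep seeing the old avatar even though a newer one exists upstream. The TTL value and names here are invented for illustration:

```python
# Toy model of one cache layer with a TTL. In reality this happens at
# both CloudFront and your browser, so the delays stack.
CACHE_TTL = 3600  # seconds; an "aggressive" lifetime, value invented

upstream = {"avatar/42.jpg": "old-avatar"}  # stands in for S3
cache = {}  # key -> (value, time it was fetched)

def get(key, now):
    if key in cache:
        value, fetched_at = cache[key]
        if now - fetched_at < CACHE_TTL:
            return value               # still "fresh" -> you see the stale avatar
    cache[key] = (upstream[key], now)  # expired or missing: refetch from upstream
    return cache[key][0]

get("avatar/42.jpg", now=0)               # first view caches "old-avatar"
upstream["avatar/42.jpg"] = "new-avatar"  # member uploads a new avatar
stale = get("avatar/42.jpg", now=100)     # within the TTL: still the old one
fresh = get("avatar/42.jpg", now=4000)    # TTL expired: finally the new one
```

With two stacked layers (CDN plus browser), the worst case is roughly the sum of both TTLs before everyone sees the new avatar.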
So why doesn't this happen on all sites? There could be a number of answers. Some sites don't store their avatars in S3 and/or behind CloudFront and instead still store the avatar images directly on the site's server. There are different vendors out there besides Amazon, like Cloudflare, and depending on how the site and the vendor configuration are set up, the CDN might be smart enough to notice the image has changed faster. Some sites simply don't cache their avatars.
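One common trick other platforms use to sidestep the whole problem (to be clear, not something this site does today, and I'm not promising it will) is "cache busting": bake a version number into the avatar URL, so each upload produces a new URL and the old cached copy simply stops being requested. All names and the domain below are hypothetical:

```python
# Sketch of cache busting: a new upload bumps the version, which changes
# the URL, so every cache layer treats it as a brand-new object.
def avatar_url(user_id, version):
    # cdn.example.com is a made-up host; real sites would use their CDN domain
    return f"https://cdn.example.com/avatars/{user_id}.jpg?v={version}"

old = avatar_url(42, version=7)
new = avatar_url(42, version=8)  # uploading a new avatar bumps the version
```

The trade-off is that every page rendering the avatar has to know the current version number, which adds a lookup the current scheme avoids.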
So why does this site cache avatars? To squeeze every possible millisecond out of the time it takes to load and display a page. For the most part members don't change their avatars very often, so it usually goes unnoticed or is just dismissed as an annoying quirk.
So what can be done about it? Honestly, with us using S3 and CloudFront I don't have a quick answer at the moment. I'm not sure if the 'annoyance' factor at the moment outweighs the potential time savings during the page loads. It's something I'll revisit but I don't have an answer other than to change your avatar in your account settings like normal and if you don't see the new avatar right away then to just wait a bit.