Come 2019, there is a new BERT (Bidirectional Encoder Representations from Transformers), a language model introduced in this paper: https://arxiv.org/abs/1810.04805.
There were 6 new questions in the tag, all related to that language model. I renamed the tag in its entirety to bert-language-model.
So for now, we can consider this small uprising from the bert-walkers to be quashed.
Update on June 30th 2019:
Apparently some users went ahead and recreated the bert tag, and 25 questions are now tagged with it. I retagged them all to the bert-language-model tag again.
Update on Aug 2nd 2019:
My monthly check on the health of this tag revealed 5 new questions in the bert tag, which I retagged.