This actually works nicely (we have > 2k repos), but I would really like it to start producing results as soon as it has any, mostly so it feels faster than it is (by being more responsive).
bbrepos(){
    # Get the total repo count, compute ceil(repos / 50), then generate the
    # page sequence and curl every page in parallel.
    # clone[1] is ssh; using a filter instead made the escaping really ugly.
    # If it becomes necessary we can use a jq --arg.
    local url=https://api.bitbucket.org/2.0/repositories/twengg
    local project=${1:+"q=project.key=\"$1\""}
    curl -snL "$url?pagelen=0&page=1" \
        | jq '"$(((\(.size) / 50) + 1 ))"' \
        | xargs bash -c 'eval "echo $1"' _ \
        | xargs seq 1 \
        | xargs -L 1 printf "?pagelen=50&page=%s&${project}\n" \
        | xargs -P 20 -I {} bash -c 'curl -snL "$1/$2" | jq -er ".values[].links.clone[1].href"' _ "$url" {} \
        | sort -u
}
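One thing I have not tried yet, so this is only a sketch: the trailing sort -u cannot emit anything until every page has been fetched, so swapping it for a streaming dedup would let URLs flow out as soon as each page completes, at the cost of unsorted output:

# toy stand-in for the slowly-arriving pages
slow(){ for r in a b a c b d; do echo "$r"; sleep 1; done; }
slow | sort -u               # nothing is printed until all input has arrived
slow | awk '!seen[$0]++'     # each value is printed the first time it is seen

In bbrepos that would amount to replacing the final | sort -u with | awk '!seen[$0]++'.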
Sample workflow:
bbsearch something | xargs -P 20 git clone
It might also be handy to have these other workflows (a rough sketch of what bbsearch could look like follows the examples):
bbsearch something other thing | xargs -P 20 git clone
cat things-to-find.txt | bbsearch | xargs -P 20 git clone
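This is roughly what I picture bbsearch looking like; the name and the details below are only illustrative since I have not written it yet:

bbsearch(){
    # search terms come from the arguments, or from stdin if none were given
    local terms
    if (($#)); then
        terms=$(printf '%s\n' "$@")
    else
        terms=$(cat)
    fi
    # grep treats the newline-separated terms as independent fixed-string
    # patterns and matches them case-insensitively against the clone URLs
    bbrepos | grep -iF -- "$terms"
}

Since it only filters the bbrepos listing, all three pipelines above would feed xargs -P 20 git clone the same way.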
Sample payload (it's normally huge, so I stripped it down to what is needed):
{
    "pagelen": 1,
    "size": 3054,
    "values": [
        {
            "links": {
                "clone": [
                    {
                        "href": "https://user@bitbucket.org/twengg/development-process.git",
                        "name": "https"
                    },
                    {
                        "href": "git@bitbucket.org:twengg/development-process.git",
                        "name": "ssh"
                    }
                ],
                "self": {
                    "href": "https://api.bitbucket.org/2.0/repositories/twengg/development-process"
                }
            }
        }
    ],
    "page": 1,
    "next": "https://api.bitbucket.org/2.0/repositories/twengg?page=2"
}
The clone URLs I need live under the .values[].links structure.
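For completeness, the ssh link can also be selected by name instead of by index; presumably a filter along these lines is where the escaping got ugly inside the single-quoted bash -c string, and --arg is the "jq arg" escape hatch mentioned in the comment:

# pick the clone link whose name is "ssh" rather than relying on its position
jq -er '.values[].links.clone[] | select(.name == "ssh") | .href'
# same thing, keeping the literal out of the filter via --arg
jq -er --arg name ssh '.values[].links.clone[] | select(.name == $name) | .href'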