Assuming that the data in your file should be taken as a raw string, it would need to be JSON-encoded. It is easiest to do that with a program that understands JSON, like jq:
json='
{
  "actions": [
    {
      "data": { "from": "test", "message": "", "to": "test2" },
      "name": "transfer"
    }
  ],
  "sec": 0,
  "usage": 0
}
'
json=$( jq --arg value "$(cat "$file")" '.actions[0].data.value = $value' <<<"$json" )
cos -u 'https://myapi.com' push data "$json"
Note that $value in the jq expression is not a shell variable but an internal jq variable; its value is given on the command line with --arg, and it is automatically JSON-encoded.
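As a quick illustration of how that encoding works (the input string here is made up), --arg takes arbitrary text and jq turns it into a properly quoted JSON string on output:

jq -n --arg value 'a "quoted" line
with a second line' '{ "value": $value }'

This prints

{
  "value": "a \"quoted\" line\nwith a second line"
}

with the embedded double quotes and the newline escaped for us.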
A slightly more convenient way to handle the data in $file, without having to expand it on the command line (which matters if the amount of data is large), is to convert the data to a JSON string separately and pass it, via a process substitution, to the jq invocation that inserts it in the correct place:
jq '.actions[0].data.value = input' - <( jq -Rs . "$file" ) <<<"$json"
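The inner jq -Rs . is the encoding step on its own: -R reads raw text rather than JSON, -s slurps the whole file into a single string, and . just prints it. Assuming $file contained the single line hello "world", it would output the JSON string

"hello \"world\"\n"

(including the trailing newline). The outer jq reads $json from standard input (-) and then pulls in that string as its next input with input.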
You could also use jo to create the data object, reading the value of its value key from the file in $file, and then insert the result using an ordinary shell variable:
datajson=$( jo from=test message="" to="test2" value=@"$file" )
json='
{
  "actions": [
    {
      "data": '"$datajson"',
      "name": "transfer"
    }
  ],
  "sec": 0,
  "usage": 0
}
'
cos -u 'https://myapi.com' push data "$json"
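For reference, and assuming $file contains just the text hello world, the jo command above would produce something like

{"from":"test","message":"","to":"test2","value":"hello world"}

(whether a trailing newline from the file ends up in the string depends on the exact file contents), and that object is what gets spliced into the document where $datajson expands.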
Note how we temporarily break out of the single-quoted string that makes up the JSON document to insert the quoted expansion of $datajson.
Single quotes disable interpolation of variables, so you would otherwise need to use "... $name ..." with double quotes instead, and then escape all the internal " characters with \".
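Purely for comparison, a minimal sketch of that double-quoted variant, using only a couple of the keys, would be

json="{ \"data\": $datajson, \"sec\": 0 }"

which gets hard to read as soon as the document grows; breaking out of the single-quoted string is usually the nicer option.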