# [Vue.js] Axios Request - Gzip data from PHP API

Is it possible to `gzcompress` data in PHP and then have Axios request it?

I’ve tried doing this but keep getting this error: “Malformed UTF-8 characters, possibly incorrectly encoded.”

My Axios request looks like this:

```javascript
axios({
  method: 'get',
  url: 'https://someapi.com/api/test',
  data: {},
  config: {
    headers: { 'Content-Type': 'application/json', 'Accept-Encoding': 'gzip' }
  }
})
  .then(response => {
    response.data.forEach(el => {
      this.transactions.push(JSON.parse(el));
      this.transactionsFull = this.transactions;
    });
    console.log(this.transactions);
  })
  .catch(e => {
    this.errors.push(e)
  })
```

On the PHP side, each transaction is compressed and pushed into Redis:

```php
$result = openssl_decrypt($cipher_text, 'aes-256-gcm', $key, OPENSSL_RAW_DATA, $iv, $auth_tag);
$json = json_decode($result);
$channel = Channel::where('uuid', $json->payload->authentication->entityId)->first();
$gzencode = gzencode(json_encode(array('transaction' => $json, 'relation' => json_decode($channel))), 8);

Redis::lpush('transactions_gzencode', $gzencode);
```

And the endpoint returns the whole list:

```php
$length = 0;
$transactions = Redis::lrange('transactions_gzencode', 0, -1);
foreach ($transactions as $item) {
    $length += strlen($item);
}
header('Content-Encoding: gzip');
header('Content-Type: application/json');
header('Content-Length: ' . $length);
return $transactions;
```

### Solution

I believe that axios is not able to decompress gzip itself, but the browser should be able to do it before axios even touches the response. For the browser to do so, you must respond with the proper HTTP headers and format. Note that to put compressed data in the HTTP response body you must use `gzencode`, not `gzcompress`, according to the PHP documentation.

Example PHP:

```php
$compressed = gzencode(json_encode(['test' => 123]));
header('Content-Type: application/json');
header('Content-Encoding: gzip');
header('Content-Length: ' . strlen($compressed));
echo $compressed;
```

Example JS:

```javascript
console.log(await (await fetch('/test')).json());
// {test: 123}
```

Edit

Since what you are trying to do is send an array of individually compressed items, you can output a JSON-encoded array of base64-encoded compressed binary data.

Example of how to use pako.js to decompress the array of compressed transactions returned from the server:

PHP:

```php
$transactions = ['first', 'second', 'third'];
echo json_encode(array_map('base64_encode', array_map('gzencode', $transactions)));
```

JS:

```javascript
(async () => {
  const transactions = (await (await fetch('/test')).json())
    .map(atob)
    .map(blob => pako.inflate(blob, { to: 'string' }));

  console.log(transactions);
})();
```

Notice that this time I didn't include the gzip headers, since the response is just a regular JSON-encoded array.

The downside of this approach is that there isn't much benefit in compressing the data, since it is converted to base64 before being sent to the client. The base64 step is necessary because otherwise `json_encode` would try to handle the binary data as a string, which would lead to string encoding errors.
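The overhead is easy to quantify: base64 turns every 3 bytes of binary into 4 output characters, so the compressed payload grows by roughly a third before it even leaves the server. A minimal illustration (the byte values are arbitrary):

```javascript
// Base64 expands binary data: every 3 input bytes become 4 output
// characters, so a compressed payload grows by about 33%.
const gzipped = Buffer.from([0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00]); // 6 arbitrary bytes
const b64 = gzipped.toString('base64');
console.log(gzipped.length, b64.length); // 6 8
```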

You can still compress the resulting JSON-encoded string before sending it to the client, as was done in the previous example, but I'm not sure how much the extra compression would help:

```php
$compressedTransactions = array_map('gzencode', ['first', 'second', 'third']);
$compressed = gzencode(json_encode(array_map('base64_encode', $compressedTransactions)));
header('Content-Type: application/json');
header('Content-Encoding: gzip');
header('Content-Length: ' . strlen($compressed));
echo $compressed;
```