The problem
Say you have a table that looks like this:
AGGREGATE_NEEDED 1
ARCH x86_64
BASE_TEST_ISSUES NUMBER
BUILD :NUMBER:PACKAGE
DISTRI DISTRIBUTION
FLAVOR Server-DVD-Incidents-Install
INCIDENT_ID 99999
It’s just that it contains 78 or more entries. For a very skilled engineer, or someone with a lot of tricks up their sleeve, this might be a trivial task in vim or the like; I guess that with a couple of search-and-replaces here and there you’d get somewhere. But I’m not skilled, other than at eating.
The Solution
So I took my key/value table, saved it to a file, and after googling a bit I’m now a little more versed in awk :D:
cat FILE.txt | \
awk 'BEGIN { print "{" }
     { printf "\"%s\":\"%s\",", $1, $2 }
     END { print "\"MANUALLY_GENERATED_ISO_POST\":1 }" }' \
| jq . > x86_64-ready.json
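For reference (and assuming FILE.txt contains only the seven sample rows above), x86_64-ready.json should come out looking something like this; note how the fixed MANUALLY_GENERATED_ISO_POST entry printed in the END block also soaks up the trailing comma left by the last printf:

{
  "AGGREGATE_NEEDED": "1",
  "ARCH": "x86_64",
  "BASE_TEST_ISSUES": "NUMBER",
  "BUILD": ":NUMBER:PACKAGE",
  "DISTRI": "DISTRIBUTION",
  "FLAVOR": "Server-DVD-Incidents-Install",
  "INCIDENT_ID": "99999",
  "MANUALLY_GENERATED_ISO_POST": 1
}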
I guess this could have been done in an easier and prettier way, but it fits my needs and might save you too at some point. Just make sure you have jq installed, ok?
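And for the record, here is one possible "easier and prettier" variant: a rough sketch that skips awk and lets jq do all the work, assuming every line is a key without spaces, followed by whitespace and then the value (anything after the first run of whitespace is kept as the value, so values with spaces survive too):

# sketch: same idea, jq only (assumes keys contain no whitespace)
jq -Rn '[inputs
         | capture("^(?<k>\\S+)\\s+(?<v>.*)$")   # key = first word, value = the rest
         | { (.k): .v }]
        | add + { MANUALLY_GENERATED_ISO_POST: 1 }' FILE.txt > x86_64-ready.json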
Top comments (4)
I find scripting languages to be much easier to write this stuff in and easier to fix later.
You can also just use the Node REPL for one-off conversions. I use this pattern for CSV to JSON all the time, which would be a nightmare using awk.
I guess something similar could be done in Perl, Ruby or any other language capable of reading files and representing JSON objects. Thing is, I end up using awk a lot, for many things. But I'll keep your idea in mind and maybe update with an easier-to-follow script version, if I ever have to do this again :)
Why not just
???
That's even better! \o/ :) Will keep it in mind next time I need to do this! Thanks!