How To Convert From JSON to YAML With Ease
I am not going to wade into the holy war of JSON vs YAML. That debate is best left to folks with the time and willingness to have it, much like OS X vs Windows vs Linux, VMware vs Xen, or iOS vs Android; the list could go on well past this blog. This post is not about picking a side. It is about showing you a way to convert from one format to the other, to ease the pain of dealing with whichever one you are handed. Let's take a simple JSON template for DynamoDB table creation in AWS. The template is located at this address.
{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "AWS CloudFormation Sample Template DynamoDB_Table: This template demonstrates the creation of a DynamoDB table. **WARNING** This template creates an Amazon DynamoDB table. You will be billed for the AWS resources used if you create a stack from this template.",
  "Parameters" : {
    "HaskKeyElementName" : {
      "Description" : "HashType PrimaryKey Name",
      "Type" : "String",
      "AllowedPattern" : "[a-zA-Z0-9]*",
      "MinLength": "1",
      "MaxLength": "2048",
      "ConstraintDescription" : "must contain only alphanumberic characters"
    },
    "HaskKeyElementType" : {
      "Description" : "HashType PrimaryKey Type",
      "Type" : "String",
      "Default" : "S",
      "AllowedPattern" : "[S|N]",
      "MinLength": "1",
      "MaxLength": "1",
      "ConstraintDescription" : "must be either S or N"
    },
    "ReadCapacityUnits" : {
      "Description" : "Provisioned read throughput",
      "Type" : "Number",
      "Default" : "5",
      "MinValue": "5",
      "MaxValue": "10000",
      "ConstraintDescription" : "must be between 5 and 10000"
    },
    "WriteCapacityUnits" : {
      "Description" : "Provisioned write throughput",
      "Type" : "Number",
      "Default" : "10",
      "MinValue": "5",
      "MaxValue": "10000",
      "ConstraintDescription" : "must be between 5 and 10000"
    }
  },
  "Resources" : {
    "myDynamoDBTable" : {
      "Type" : "AWS::DynamoDB::Table",
      "Properties" : {
        "AttributeDefinitions": [ {
          "AttributeName" : {"Ref" : "HaskKeyElementName"},
          "AttributeType" : {"Ref" : "HaskKeyElementType"}
        } ],
        "KeySchema": [
          { "AttributeName": {"Ref" : "HaskKeyElementName"}, "KeyType": "HASH" }
        ],
        "ProvisionedThroughput" : {
          "ReadCapacityUnits" : {"Ref" : "ReadCapacityUnits"},
          "WriteCapacityUnits" : {"Ref" : "WriteCapacityUnits"}
        }
      }
    }
  },
  "Outputs" : {
    "TableName" : {
      "Value" : {"Ref" : "myDynamoDBTable"},
      "Description" : "Table name of the newly created DynamoDB table"
    }
  }
}
Some people have no problem reading JSON and working with its "large" number of curly braces. I personally prefer not to have to do that. So, after looking for information and modifying it to suit my needs, I am now able to convert the above template into the following YAML file (the library calls that do the actual work are sketched right after it):
---
AWSTemplateFormatVersion: '2010-09-09'
Description: 'AWS CloudFormation Sample Template DynamoDB_Table: This template demonstrates
  the creation of a DynamoDB table. **WARNING** This template creates an Amazon DynamoDB
  table. You will be billed for the AWS resources used if you create a stack from
  this template.'
Parameters:
  HaskKeyElementName:
    Description: HashType PrimaryKey Name
    Type: String
    AllowedPattern: "[a-zA-Z0-9]*"
    MinLength: '1'
    MaxLength: '2048'
    ConstraintDescription: must contain only alphanumberic characters
  HaskKeyElementType:
    Description: HashType PrimaryKey Type
    Type: String
    Default: S
    AllowedPattern: "[S|N]"
    MinLength: '1'
    MaxLength: '1'
    ConstraintDescription: must be either S or N
  ReadCapacityUnits:
    Description: Provisioned read throughput
    Type: Number
    Default: '5'
    MinValue: '5'
    MaxValue: '10000'
    ConstraintDescription: must be between 5 and 10000
  WriteCapacityUnits:
    Description: Provisioned write throughput
    Type: Number
    Default: '10'
    MinValue: '5'
    MaxValue: '10000'
    ConstraintDescription: must be between 5 and 10000
Resources:
  myDynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
      - AttributeName:
          Ref: HaskKeyElementName
        AttributeType:
          Ref: HaskKeyElementType
      KeySchema:
      - AttributeName:
          Ref: HaskKeyElementName
        KeyType: HASH
      ProvisionedThroughput:
        ReadCapacityUnits:
          Ref: ReadCapacityUnits
        WriteCapacityUnits:
          Ref: WriteCapacityUnits
Outputs:
  TableName:
    Value:
      Ref: myDynamoDBTable
    Description: Table name of the newly created DynamoDB table
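The conversion itself is nothing exotic. Ruby ships with both a JSON and a YAML library in its standard distribution, so the round trip boils down to parsing one format and dumping the other. A minimal sketch of that idea, with placeholder file names:

require 'json'
require 'yaml'

# JSON -> YAML: parse the JSON text into a Ruby hash, then dump it as YAML.
yaml_text = YAML.dump(JSON.parse(File.read('template.json')))

# YAML -> JSON: load the YAML document, then pretty-print it as JSON.
json_text = JSON.pretty_generate(YAML.load_file('template.yaml'))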
Solution
Now, in my personal opinion, the above YAML version is much easier to read than the JSON version. If you have similar notions, take a look at the script below. It is mostly self-explanatory and prints a bit of help if you miss something on the command line; a couple of sample invocations follow the script. What it does is convert one format into the other and back again, which lets me work on my CloudFormation templates in YAML and then convert them into proper JSON for upload into AWS.
#!/usr/bin/env ruby
require 'yaml'
require 'json'
require 'optparse'
require 'ostruct'
require 'fileutils'

# Quit unless our script gets three command line arguments:
# a conversion flag plus the two file names.
unless ARGV.length == 3
  puts "Dude, not the right number of arguments."
  puts "Usage: ruby YJ_Convert.rb [-j][-y] json_file.json yaml_file.yaml\n"
  exit
end

$json_file = ARGV[1]
$yaml_file = ARGV[2]

options = OpenStruct.new
OptionParser.new do |opt|
  opt.on('-j', '--json', 'Convert to JSON') { |o| options.json = o }
  opt.on('-y', '--yaml', 'Convert to YAML') { |o| options.yaml = o }
end.parse!

case
when options.yaml == true
  # Parse the JSON template and dump it back out as YAML,
  # overwriting any existing output file.
  y_file = File.open($yaml_file, 'w')
  y_file.write(YAML.dump(JSON.parse(IO.read($json_file))))
  y_file.close
  puts "Converted to YAML. Output file is #{$yaml_file}"
when options.json == true
  # Load the YAML template and write it back out as pretty-printed JSON.
  j_file = YAML.load_file($yaml_file)
  File.write($json_file, JSON.pretty_generate(j_file))
  puts "Converted to JSON. Output file is #{$json_file}"
end
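Usage follows the help text baked into the script; the file names below are just examples:

# JSON -> YAML
ruby YJ_Convert.rb -y DynamoDB.template DynamoDB.yaml

# YAML -> JSON, ready for upload to CloudFormation
ruby YJ_Convert.rb -j DynamoDB.template DynamoDB.yaml

Note that the JSON file name always comes first and the YAML file name second, regardless of direction; the flag decides which file is read and which one is written.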
Hope this helps someone who needs to convert JSON to YAML and back again.