/*
Package barkup is a library for backing things up. It provides tools for writing bare-bones backup programs in Go. The library is broken out into exporters and storers. Currently, those are: MySQL, Postgres, RethinkDB (exporters), and S3 (storer).
Quick Example
Here's a Go program that backs up a MySQL database (Exporter) to an S3 bucket (Storer) using barkup. The resulting executable is plopped on a server somewhere and scheduled to execute via CRON.
    package main

    import "github.com/keighl/barkup"

    func main() {
        // Configure a MySQL exporter
        mysql := &barkup.MySQL{
            Host:     "localhost",
            Port:     "3306",
            DB:       "production_db",
            User:     "root",
            Password: "cheese",
        }

        // Configure an S3 storer
        s3 := &barkup.S3{
            Region:       "us-east-1",
            Bucket:       "backups",
            AccessKey:    "*************",
            ClientSecret: "**********************",
        }

        // Export the database, and send it to the
        // bucket in the db_backups folder
        err := mysql.Export().To("db_backups/", s3)
        if err != nil {
            panic(err)
        }
    }
...

    $ go build

...

    @hourly /path/to/backup-program
Exporters
Exporters provide a common interface for backing things up via the Export() method. It writes an export file to the local disk and returns an ExportResult, which can be passed on to a storer or moved to another location on the disk.
    // Exporter
    mysql := &barkup.MySQL{...}

    // Export result
    result := mysql.Export()
    if result.Error != nil {
        panic(result.Error)
    }

    // Send it to a directory path on a storer
    err := result.To("backups/", storer)

    // OR just move it somewhere on the local disk
    err = result.To("~/backups/", nil)
MySQL
The MySQL exporter uses mysqldump to pump out a gzipped archive of your database. mysqldump must be installed on your system (it probably is if you're using MySQL), and accessible to the user running the final program (again, it probably is).
    mysql := &barkup.MySQL{
        Host:     "127.0.0.1",
        Port:     "3306",
        DB:       "XXXXX",
        User:     "XXXXX",
        Password: "XXXXX",
        // Any extra mysqldump options
        Options: []string{"--skip-extended-insert"},
    }

    // Writes a file ./bu_DBNAME_TIMESTAMP.sql.tar.gz
    result := mysql.Export()
    if result.Error != nil {
        panic(result.Error)
    }
Postgres
The Postgres exporter uses pg_dump to make a gzipped archive of your database. pg_dump must be installed on your system (it probably is if you're using Postgres).
    postgres := &barkup.Postgres{
        Host: "127.0.0.1",
        Port: "5432",
        DB:   "XXXXXXXX",
        // Not necessary if the program runs as an authorized pg user/role
        Username: "XXXXXXXX",
        // Any extra pg_dump options
        Options: []string{"--no-owner"},
    }

    // Writes a file ./bu_DBNAME_TIMESTAMP.sql.tar.gz
    result := postgres.Export()
    if result.Error != nil {
        panic(result.Error)
    }
Connection credentials
You have two options for allowing barkup to connect to your DB. Add a [~/.pgpass](http://www.postgresql.org/docs/9.3/static/libpq-pgpass.html) file for the account that will run the backup program. Or, run the backup program as an authenticated user, like postgres:

    $ sudo -i -u postgres
    $ ./backup-program
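If you go the ~/.pgpass route, the format (per the PostgreSQL docs) is one colon-separated connection per line, and the file must not be group- or world-readable (chmod 600). The host, database, user, and password below are placeholders, not values barkup expects:

```
# ~/.pgpass  (chmod 600 ~/.pgpass)
# hostname:port:database:username:password
127.0.0.1:5432:production_db:backup_user:s3cret
```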
RethinkDB
The RethinkDB exporter uses rethinkdb dump to make a gzipped archive of your cluster. rethinkdb must be installed on your system.
Usage
    rethink := &barkup.RethinkDB{
        Name:       "nightly",
        Connection: "0.0.0.0:28015",
        // You can specify specific databases and/or tables to dump
        // (by default it dumps your whole cluster)
        Targets: []string{"site", "leads.contacts"},
    }

    // Writes a file ./bu_nightly_TIMESTAMP.tar.gz
    result := rethink.Export()
    if result.Error != nil {
        panic(result.Error)
    }
Storers
Storers take an ExportResult object and provide a common interface for moving a backup to someplace safe.
    // For chaining an ExportResult
    err := someExportResult.To("backups/", someStorer)

    // OR
    err = someStorer.Store(someExportResult, "backups/")
S3
The S3 storer puts the exported file into a bucket at a specified directory. Note: you shouldn't use your global AWS credentials for this. Instead, [create bucket-specific credentials via IAM](http://blogs.aws.amazon.com/security/post/Tx3VRSWZ6B3SHAV/Writing-IAM-Policies-How-to-grant-access-to-an-Amazon-S3-bucket).
    s3 := &barkup.S3{
        Region:       "us-east-1",
        Bucket:       "backups",
        AccessKey:    "XXXXXXXXXXXXX",
        ClientSecret: "XXXXXXXXXXXXXXXXXXXXX",
    }

    err := someExportResult.To("data/", s3)
Region IDs
* us-east-1
* us-west-1
* us-west-2
* eu-west-1
* ap-southeast-1
* ap-southeast-2
* ap-northeast-1
* sa-east-1
*/
package barkup