Compare commits

..

12 Commits

Author  SHA1  Message  Date
Bret Comnes  7f3db0b932  1.0.1  2020-02-17 21:21:54 -07:00
Bret Comnes  91c25ab722  docs: typos  2020-02-17 21:21:15 -07:00
Bret Comnes  2ac7b56408  1.0.0  2020-02-17 21:19:33 -07:00
Bret Comnes  3024afb807  docs: add missing example  2020-02-17 21:18:51 -07:00
Bret Comnes  60d959d7ed  docs: update logo  2020-02-17 21:14:22 -07:00
Bret Comnes  6d0a96de91  Merge branch 'master' of github.com:bcomnes/deploy-to-neocities  2020-02-17 21:12:54 -07:00
Bret Comnes  da5f527254  docs: Document action, inputs and outputs.  2020-02-17 21:12:06 -07:00
Bret Comnes  6bcb1bafa6  Merge pull request #1 from bcomnes/greenkeeper/initial (Update dependencies to enable Greenkeeper 🌴)  2020-02-13 12:20:26 -07:00
Bret Comnes  df9c0b9fe1  Update README.md  2020-02-13 12:19:42 -07:00
greenkeeper[bot]  bcc9d90095  docs(readme): add Greenkeeper badge  2020-02-13 19:16:16 +00:00
greenkeeper[bot]  7092d0c55c  chore(package): update dependencies  2020-02-13 19:16:12 +00:00
Bret Comnes  6b86fa30b4  bug: remove extraneous logging  2020-02-13 12:08:54 -07:00
12 changed files with 281 additions and 172 deletions

View File

@@ -7,6 +7,24 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
## [v1.0.1](https://github.com/bcomnes/deploy-to-neocities/compare/v1.0.0...v1.0.1) - 2020-02-18
### Commits
- docs: typos [`91c25ab`](https://github.com/bcomnes/deploy-to-neocities/commit/91c25ab7221a139f318ed7ef4a6518d5a64debe8)
## [v1.0.0](https://github.com/bcomnes/deploy-to-neocities/compare/v0.0.11...v1.0.0) - 2020-02-18
### Merged
- Update dependencies to enable Greenkeeper 🌴 [`#1`](https://github.com/bcomnes/deploy-to-neocities/pull/1)
### Commits
- docs: Document action, inputs and outputs. [`da5f527`](https://github.com/bcomnes/deploy-to-neocities/commit/da5f52725420f64188f5130c7bb54fda5252e3ec)
- docs: add missing example [`3024afb`](https://github.com/bcomnes/deploy-to-neocities/commit/3024afb80791888e8c0fd04cf23ba6e7560a5f40)
- Update README.md [`df9c0b9`](https://github.com/bcomnes/deploy-to-neocities/commit/df9c0b9fe144eb76092e3998b2013e130756c950)
## [v0.0.11](https://github.com/bcomnes/deploy-to-neocities/compare/v0.0.10...v0.0.11) - 2020-02-13
### Commits

View File

@@ -2,8 +2,75 @@
[![Actions Status](https://github.com/bcomnes/deploy-to-neocities/workflows/tests/badge.svg)](https://github.com/bcomnes/deploy-to-neocities/actions)
<center><img src="logo.png"></center>
Efficiently deploy a website to [Neocities][nc].
### Pre-requisites
Create a workflow `.yml` file in your repository's `.github/workflows` directory. An [example workflow](#example-workflow) is available below. For more information, see the GitHub Help documentation on [creating a workflow file](https://help.github.com/en/articles/configuring-a-workflow#creating-a-workflow-file).
Get your site's API token and store it as a [secret][sec] called `NEOCITIES_API_TOKEN`, then set the `api_token` input on your `deploy-to-neocities` action to that secret. The token can be found at:
```
https://neocities.org/settings/{{sitename}}#api_key
```
During your workflow, generate the files you want to deploy to [Neocities][nc] into a build directory. You can use any tools that can be installed into, or brought along to, the GitHub Actions environment.
Specify this directory as the `dist_dir` input. Once the build is complete, the `deploy-to-neocities` action efficiently uploads all new and changed files to Neocities. Any files on Neocities that don't exist in the `dist_dir` are considered 'orphaned' files. To destructively remove these orphaned files, set the `cleanup` input to `true`.
You most likely only want to run this action on the `master` branch, so that only changes committed to `master` result in website updates.
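Conceptually, the `cleanup` behavior described above is just a set difference between the files already on Neocities and the files in `dist_dir`. The snippet below is a rough, self-contained sketch of that idea only; it is not the action's actual implementation (the real diffing is handled by the async-neocities library), and the file lists are made up.

```js
// Rough sketch of the 'orphaned file' idea only; not the action's real code.
// Paths are site-relative, the way Neocities stores them.
function findOrphanedFiles (remoteFiles, localFiles) {
  const localSet = new Set(localFiles)
  // Anything that exists on the live site but not in dist_dir is orphaned.
  return remoteFiles.filter(file => !localSet.has(file))
}

const remote = ['index.html', 'css/style.css', 'old-page.html']
const local = ['index.html', 'css/style.css', 'new-page.html']

console.log(findOrphanedFiles(remote, local))
// => ['old-page.html']  (deleted only when the cleanup input is set to 'true')
```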
### Example workflow
```yaml
name: Deploy to Neocities
on:
push:
branches:
- master
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Use Node.js
uses: actions/setup-node@v1
with:
node-version: 12
- name: Install deps and build
run: |
npm i
npm run build
- name: Deploy to Neocities
uses: bcomnes/deploy-to-neocities@v1
with:
api_token: ${{ secrets.NEOCITIES_API_TOKEN }}
cleanup: false
dist_dir: public
```
### Inputs
- `api_token` (**REQUIRED**): The API token for the [Neocities][nc] website you are deploying to.
- `dist_dir`: The directory to deploy to [Neocities][nc]. Default: `public`.
- `cleanup`: Boolean string (`true` or `false`). If `true`, `deploy-to-neocities` will destructively delete any files found on [Neocities][nc] that are not found in your `dist_dir`. Default: `false`.
### Outputs
None.
## See also
- [async-neocities](https://ghub.io/async-neocities): the diffing engine used by this action (see the usage sketch below).
- [Neocities API Docs](https://neocities.org/api)
- [neocities/neocities-node](https://github.com/neocities/neocities-node): Node API client
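For reference, driving the diffing engine directly looks roughly like the sketch below. The `new Neocities(token)` constructor matches what appears in this repository's `index.js`; the `deploy` call and its option names here are assumptions and may not match the library's actual API, so treat this as a sketch and check the async-neocities docs.

```js
// Sketch of using async-neocities directly; the deploy() call and its
// option names are assumptions, so verify them against the library docs.
const path = require('path')
const Neocities = require('async-neocities')

async function main () {
  const token = process.env.NEOCITIES_API_TOKEN
  const client = new Neocities(token) // same constructor used in index.js
  const distDir = path.join(process.cwd(), 'public')

  // Upload new and changed files; optionally delete orphaned remote files.
  await client.deploy(distDir, { cleanup: false })
}

main().catch(err => {
  console.error(err)
  process.exit(1)
})
```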
[qs]: https://ghub.io/qs
[nf]: https://ghub.io/node-fetch
[fd]: https://ghub.io/form-data
[nc]: https://neocities.org
[sec]: https://help.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets

View File

@@ -1,10 +1,10 @@
name: 'Deploy to Neocities'
description: 'Github Action to deploy a folder to Neocities.org'
description: 'Efficiently deploy a folder to Neocities.org'
branding:
icon: cat
color: yellow
icon: aperture
color: orange
inputs:
apiToken: # api token for site to deploy to
api_token: # api token for site to deploy to
description: 'Neocities API token for site to deploy to'
required: true
distDir:

View File

@@ -6,11 +6,10 @@ const ms = require('ms')
const assert = require('nanoassert')
async function doDeploy () {
const token = core.getInput('apiToken')
const distDir = path.join(process.cwd(), core.getInput('distDir'))
const token = core.getInput('api_token')
const distDir = path.join(process.cwd(), core.getInput('dist_dir'))
const cleanup = JSON.parse(core.getInput('cleanup'))
assert(typeof cleanup === 'boolean', 'Cleanup input must be a boolean "true" or "false"')
console.log(typeof cleanup)
const client = new Neocities(token)
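One note on the hunk above: `core.getInput()` returns a string, which is presumably why the `cleanup` input is run through `JSON.parse` and then asserted to be a boolean. Below is a standalone illustration of that pattern; the function name is made up, and only the parse-and-assert shape mirrors the diff.

```js
// Standalone illustration of parsing a boolean-like action input.
// GitHub Actions inputs always arrive as strings such as 'true' or 'false'.
const assert = require('nanoassert')

function parseBooleanInput (raw) {
  const value = JSON.parse(raw) // 'true' -> true, 'false' -> false
  assert(typeof value === 'boolean', 'Cleanup input must be a boolean "true" or "false"')
  return value
}

console.log(parseBooleanInput('true'))  // => true
console.log(parseBooleanInput('false')) // => false
```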

BIN logo.png: new file, 110 KiB (binary content not shown)

View File

@@ -7,6 +7,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
Generated by [`auto-changelog`](https://github.com/CookPete/auto-changelog).
## [v1.1.6](https://github.com/bcomnes/async-neocities/compare/v1.1.5...v1.1.6) - 2020-02-13
### Commits
- refactor: limit decimal places in statHandler [`745b5c0`](https://github.com/bcomnes/async-neocities/commit/745b5c08b446723b837cf787e5911045cd0a01ed)
## [v1.1.5](https://github.com/bcomnes/async-neocities/compare/v1.1.4...v1.1.5) - 2020-02-13
### Commits

View File

@@ -49,7 +49,7 @@ function statsHandler (opts = {}) {
}
function logProgress (stats) {
let logLine = `Stage ${stats.stage}: ${stats.progress * 100}%`
let logLine = `Stage ${stats.stage}: ${(stats.progress * 100).toFixed(2)}%`
if (stats.bytesWritten != null && stats.totalBytes != null) {
logLine = logLine + ` (${prettyBytes(stats.bytesWritten)} / ${prettyBytes(stats.totalBytes)})`
}
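To make the formatting change above concrete, here is a small standalone illustration; the stage name and byte counts are made up, and only the template strings mirror the diff.

```js
// Standalone illustration of the percentage formatting change (made-up numbers).
const bytesWritten = 1234
const totalBytes = 4096
const progress = bytesWritten / totalBytes // 0.30126953125

console.log(`Stage upload: ${progress * 100}%`)              // Stage upload: 30.126953125%
console.log(`Stage upload: ${(progress * 100).toFixed(2)}%`) // Stage upload: 30.13%
```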

View File

@@ -1,26 +1,26 @@
{
"_from": "async-neocities@1.1.5",
"_id": "async-neocities@1.1.5",
"_from": "async-neocities@1.1.6",
"_id": "async-neocities@1.1.6",
"_inBundle": false,
"_integrity": "sha512-7EHFjdmPACBQ2prEr7wUQsIi9bVXrD5tpR9PTmDndmZn5U+0uPe6dQxuk+xKI6BLizQVQVKjjODPTI+De5rlTA==",
"_integrity": "sha512-q5fTVttBaN9znGxqxxDAh/ks+bZngIJPu6zPS7nlbJLC9NnOhrcP5Mu0VntxgEBtAuaExyI6uH/C+CxKSW0LeQ==",
"_location": "/async-neocities",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
"raw": "async-neocities@1.1.5",
"raw": "async-neocities@1.1.6",
"name": "async-neocities",
"escapedName": "async-neocities",
"rawSpec": "1.1.5",
"rawSpec": "1.1.6",
"saveSpec": null,
"fetchSpec": "1.1.5"
"fetchSpec": "1.1.6"
},
"_requiredBy": [
"/"
],
"_resolved": "https://registry.npmjs.org/async-neocities/-/async-neocities-1.1.5.tgz",
"_shasum": "b9d31cf903ff6253beaf9012201bfae97908436f",
"_spec": "async-neocities@1.1.5",
"_resolved": "https://registry.npmjs.org/async-neocities/-/async-neocities-1.1.6.tgz",
"_shasum": "405b45565ccbe9c4ea56e65552ae9c48c20a0309",
"_spec": "async-neocities@1.1.6",
"_where": "/Users/bret/repos/deploy-to-neocities",
"author": {
"name": "Bret Comnes",
@@ -80,5 +80,5 @@
"dist"
]
},
"version": "1.1.5"
"version": "1.1.6"
}

View File

@@ -15,7 +15,7 @@ npm install --save readable-stream
This package is a mirror of the streams implementations in Node.js.
Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v10.18.1/docs/api/stream.html).
Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v10.19.0/docs/api/stream.html).
If you want to guarantee a stable streams base, regardless of what version of
Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).

View File

@@ -6,6 +6,12 @@ function _objectSpread(target) { for (var i = 1; i < arguments.length; i++) { va
function _defineProperty(obj, key, value) { if (key in obj) { Object.defineProperty(obj, key, { value: value, enumerable: true, configurable: true, writable: true }); } else { obj[key] = value; } return obj; }
function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } }
function _defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } }
function _createClass(Constructor, protoProps, staticProps) { if (protoProps) _defineProperties(Constructor.prototype, protoProps); if (staticProps) _defineProperties(Constructor, staticProps); return Constructor; }
var _require = require('buffer'),
Buffer = _require.Buffer;
@@ -22,170 +28,183 @@ module.exports =
/*#__PURE__*/
function () {
function BufferList() {
_classCallCheck(this, BufferList);
this.head = null;
this.tail = null;
this.length = 0;
}
var _proto = BufferList.prototype;
_proto.push = function push(v) {
var entry = {
data: v,
next: null
};
if (this.length > 0) this.tail.next = entry;else this.head = entry;
this.tail = entry;
++this.length;
};
_proto.unshift = function unshift(v) {
var entry = {
data: v,
next: this.head
};
if (this.length === 0) this.tail = entry;
this.head = entry;
++this.length;
};
_proto.shift = function shift() {
if (this.length === 0) return;
var ret = this.head.data;
if (this.length === 1) this.head = this.tail = null;else this.head = this.head.next;
--this.length;
return ret;
};
_proto.clear = function clear() {
this.head = this.tail = null;
this.length = 0;
};
_proto.join = function join(s) {
if (this.length === 0) return '';
var p = this.head;
var ret = '' + p.data;
while (p = p.next) {
ret += s + p.data;
_createClass(BufferList, [{
key: "push",
value: function push(v) {
var entry = {
data: v,
next: null
};
if (this.length > 0) this.tail.next = entry;else this.head = entry;
this.tail = entry;
++this.length;
}
return ret;
};
_proto.concat = function concat(n) {
if (this.length === 0) return Buffer.alloc(0);
var ret = Buffer.allocUnsafe(n >>> 0);
var p = this.head;
var i = 0;
while (p) {
copyBuffer(p.data, ret, i);
i += p.data.length;
p = p.next;
}, {
key: "unshift",
value: function unshift(v) {
var entry = {
data: v,
next: this.head
};
if (this.length === 0) this.tail = entry;
this.head = entry;
++this.length;
}
return ret;
} // Consumes a specified amount of bytes or characters from the buffered data.
;
_proto.consume = function consume(n, hasStrings) {
var ret;
if (n < this.head.data.length) {
// `slice` is the same for buffers and strings.
ret = this.head.data.slice(0, n);
this.head.data = this.head.data.slice(n);
} else if (n === this.head.data.length) {
// First chunk is a perfect match.
ret = this.shift();
} else {
// Result spans more than one buffer.
ret = hasStrings ? this._getString(n) : this._getBuffer(n);
}, {
key: "shift",
value: function shift() {
if (this.length === 0) return;
var ret = this.head.data;
if (this.length === 1) this.head = this.tail = null;else this.head = this.head.next;
--this.length;
return ret;
}
}, {
key: "clear",
value: function clear() {
this.head = this.tail = null;
this.length = 0;
}
}, {
key: "join",
value: function join(s) {
if (this.length === 0) return '';
var p = this.head;
var ret = '' + p.data;
return ret;
};
_proto.first = function first() {
return this.head.data;
} // Consumes a specified amount of characters from the buffered data.
;
_proto._getString = function _getString(n) {
var p = this.head;
var c = 1;
var ret = p.data;
n -= ret.length;
while (p = p.next) {
var str = p.data;
var nb = n > str.length ? str.length : n;
if (nb === str.length) ret += str;else ret += str.slice(0, n);
n -= nb;
if (n === 0) {
if (nb === str.length) {
++c;
if (p.next) this.head = p.next;else this.head = this.tail = null;
} else {
this.head = p;
p.data = str.slice(nb);
}
break;
while (p = p.next) {
ret += s + p.data;
}
++c;
return ret;
}
}, {
key: "concat",
value: function concat(n) {
if (this.length === 0) return Buffer.alloc(0);
var ret = Buffer.allocUnsafe(n >>> 0);
var p = this.head;
var i = 0;
this.length -= c;
return ret;
} // Consumes a specified amount of bytes from the buffered data.
;
_proto._getBuffer = function _getBuffer(n) {
var ret = Buffer.allocUnsafe(n);
var p = this.head;
var c = 1;
p.data.copy(ret);
n -= p.data.length;
while (p = p.next) {
var buf = p.data;
var nb = n > buf.length ? buf.length : n;
buf.copy(ret, ret.length - n, 0, nb);
n -= nb;
if (n === 0) {
if (nb === buf.length) {
++c;
if (p.next) this.head = p.next;else this.head = this.tail = null;
} else {
this.head = p;
p.data = buf.slice(nb);
}
break;
while (p) {
copyBuffer(p.data, ret, i);
i += p.data.length;
p = p.next;
}
++c;
return ret;
} // Consumes a specified amount of bytes or characters from the buffered data.
}, {
key: "consume",
value: function consume(n, hasStrings) {
var ret;
if (n < this.head.data.length) {
// `slice` is the same for buffers and strings.
ret = this.head.data.slice(0, n);
this.head.data = this.head.data.slice(n);
} else if (n === this.head.data.length) {
// First chunk is a perfect match.
ret = this.shift();
} else {
// Result spans more than one buffer.
ret = hasStrings ? this._getString(n) : this._getBuffer(n);
}
return ret;
}
}, {
key: "first",
value: function first() {
return this.head.data;
} // Consumes a specified amount of characters from the buffered data.
this.length -= c;
return ret;
} // Make sure the linked list only shows the minimal necessary information.
;
}, {
key: "_getString",
value: function _getString(n) {
var p = this.head;
var c = 1;
var ret = p.data;
n -= ret.length;
_proto[custom] = function (_, options) {
return inspect(this, _objectSpread({}, options, {
// Only inspect one level.
depth: 0,
// It should not recurse.
customInspect: false
}));
};
while (p = p.next) {
var str = p.data;
var nb = n > str.length ? str.length : n;
if (nb === str.length) ret += str;else ret += str.slice(0, n);
n -= nb;
if (n === 0) {
if (nb === str.length) {
++c;
if (p.next) this.head = p.next;else this.head = this.tail = null;
} else {
this.head = p;
p.data = str.slice(nb);
}
break;
}
++c;
}
this.length -= c;
return ret;
} // Consumes a specified amount of bytes from the buffered data.
}, {
key: "_getBuffer",
value: function _getBuffer(n) {
var ret = Buffer.allocUnsafe(n);
var p = this.head;
var c = 1;
p.data.copy(ret);
n -= p.data.length;
while (p = p.next) {
var buf = p.data;
var nb = n > buf.length ? buf.length : n;
buf.copy(ret, ret.length - n, 0, nb);
n -= nb;
if (n === 0) {
if (nb === buf.length) {
++c;
if (p.next) this.head = p.next;else this.head = this.tail = null;
} else {
this.head = p;
p.data = buf.slice(nb);
}
break;
}
++c;
}
this.length -= c;
return ret;
} // Make sure the linked list only shows the minimal necessary information.
}, {
key: custom,
value: function value(_, options) {
return inspect(this, _objectSpread({}, options, {
// Only inspect one level.
depth: 0,
// It should not recurse.
customInspect: false
}));
}
}]);
return BufferList;
}();

View File

@@ -1,8 +1,8 @@
{
"_from": "readable-stream@^3.1.1",
"_id": "readable-stream@3.5.0",
"_id": "readable-stream@3.6.0",
"_inBundle": false,
"_integrity": "sha512-gSz026xs2LfxBPudDuI41V1lka8cxg64E66SGe78zJlsUofOg/yqwezdIcdfwik6B4h8LFmWPA9ef9X3FiNFLA==",
"_integrity": "sha512-BViHy7LKeTz4oNnkcLJ+lVSL6vpiFeX6/d3oSH8zCW7UxP2onchk+vTGB143xuFjHS3deTgkKoXXymXqymiIdA==",
"_location": "/readable-stream",
"_phantomChildren": {},
"_requested": {
@@ -18,8 +18,8 @@
"_requiredBy": [
"/duplexify"
],
"_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.5.0.tgz",
"_shasum": "465d70e6d1087f6162d079cd0b5db7fbebfd1606",
"_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.0.tgz",
"_shasum": "337bbda3adc0706bd3e024426a286d4b4b2c9198",
"_spec": "readable-stream@^3.1.1",
"_where": "/Users/bret/repos/deploy-to-neocities/node_modules/duplexify",
"browser": {
@@ -93,5 +93,5 @@
"test-browsers": "airtap --sauce-connect --loopback airtap.local -- test/browser.js",
"update-browser-errors": "babel -o errors-browser.js errors.js"
},
"version": "3.5.0"
"version": "3.6.0"
}

View File

@@ -1,6 +1,6 @@
{
"name": "deploy-to-neocities",
"version": "0.0.11",
"version": "1.0.1",
"description": "Github Action to deplpoy a folder to Neocities.org",
"main": "index.js",
"private": true,
@@ -39,7 +39,7 @@
"dependencies": {
"@actions/core": "1.2.2",
"@actions/github": "2.1.0",
"async-neocities": "1.1.5",
"async-neocities": "1.1.6",
"ms": "^2.1.2"
},
"standard": {