
Fleek Network docs


The Fleek Network documentation and guides source.

🤖 Installation

yarn

🏠 Local Development

yarn start

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.
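If the default port is taken or you want to test from another device, the standard Docusaurus CLI accepts a few flags (a sketch; the port and host values below are just examples):

# Run the dev server on a different port, exposed on the local network:
yarn start --port 3001 --host 0.0.0.0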

👷 Build

yarn build

This command generates static content into the build directory and can be served using any static content hosting service.
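To preview the output locally before publishing, any static file server will do; for example (assumes Node/npx is available, and http-server is just one option):

# Generate the static site, then serve ./build locally:
yarn build
npx http-server ./build -p 8080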

🕸 Crawler

Crawls are handled by the Algolia Crawler and are scheduled to run once a week by default. You can also trigger new crawls yourself and monitor them directly from the Crawler interface, which offers a live editor where you can maintain your config.

For this reason, crawling on CI deployment is disabled; it can be enabled by moving from the DocSearch open-source license to a paid account.
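A manual crawl can also be requested over the Crawler REST API (a hypothetical sketch; the reindex endpoint and Basic auth scheme are assumptions based on Algolia's Crawler API docs, so verify them there before relying on this):

# Trigger a reindex for a given crawler ID (placeholders, not real credentials):
curl -X POST "https://crawler.algolia.com/api/1/crawlers/<CRAWLER_ID>/reindex" \
  --user "<CRAWLER_USER_ID>:<CRAWLER_API_KEY>"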

🚀 Deployment

The documentation site is the static output of the build command (the build directory).

A GitHub workflow is set up to build and publish to gh_pages; each commit to gh_pages then triggers the pages-build-deployment workflow.

Any new commit to the main branch will trigger the Deploy (GitHub Pages) action. For this reason, to publish a new build, all you have to do is commit to the main branch.
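In practice, publishing looks like an ordinary commit and push (the file and message below are hypothetical):

git add docs/getting-started.md
git commit -m "docs: ✏️ Update the getting started guide"
git push origin main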

Alternatively, to publish manually to gh_pages, use the deploy command. Here we prefix the command with the optional environment variables.

USE_SSH=true GIT_USER=<Your github username> yarn deploy

💡 The command requires you to have Git authenticated via SSH.
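If you'd rather deploy over HTTPS than SSH, Docusaurus's deploy docs only require GIT_USER (the username is a placeholder):

GIT_USER=<Your github username> yarn deploy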

🕸 Web Crawl

⚠️ The crawl:docker command will not work for now, as we moved to the open-source version of Algolia called DocSearch; the Crawler has to be triggered manually through their dashboard, or you can wait for the scheduled job. If you prefer to use crawl:docker in CI, a paid subscription is required.

To trigger it manually, read here

The CLI version requires a paid subscription.

The web crawler, or spider, is used to search and automatically index website content. DocSearch can run the process periodically, but you can also run it manually if you prefer.

You can crawl your website by running the packaged Docker image.

You'll need to have installed:

  • jq (command-line JSON processor)
  • Docker

Also, create a dotenv file (.env) with the following:

APPLICATION_ID=<YOUR APP ID>
API_KEY=<YOUR API KEY>

Then you need to start the crawl according to your configuration.

yarn crawl:docker
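For reference, crawl:docker typically wraps something like the following (a sketch based on the legacy DocSearch scraper docs, which is also why jq is required; the actual script lives in package.json):

# Feed the .env credentials and the stringified config to the scraper image:
docker run -it --env-file=.env -e "CONFIG=$(cat config.docsearch.json | jq -r tostring)" algolia/docsearch-scraper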

👩‍🎨 Custom domain

A custom domain (Cloudflare) is set up to point to GitHub Pages as docs.fleek.network.

The docusaurus.config.js and config.docsearch.json are set to use the custom domain. There's another file that persists the custom domain for GitHub Pages: static/CNAME, which should contain the docs.fleek.network custom domain.

This is important; otherwise, the DNS check would fail!
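A quick way to sanity-check the setup (assumes dig is installed; the CNAME path follows Docusaurus's static directory convention):

# The file Docusaurus copies into the build output root:
cat static/CNAME   # should print: docs.fleek.network
# Verify the DNS record resolves toward GitHub Pages:
dig +short CNAME docs.fleek.network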

📖 Version

WIP

🙏 Contribution guideline

Create branches from the main branch and name them in accordance with Conventional Commits here, or follow the examples below:

test: 💍 Adding missing tests
feat: 🎸 A new feature
fix: 🐛 A bug fix
chore: 🤖 Build process or auxiliary tool changes
docs: ✏️ Documentation only changes
refactor: 💡 A code change that neither fixes a bug nor adds a feature
style: 💄 Markup, white-space, formatting, missing semi-colons...
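For example, a documentation-only change could look like this (the branch name and message are hypothetical):

# Branch off main, named after the Conventional Commit type:
git checkout -b docs/update-node-guide
git commit -m "docs: ✏️ Update the node guide"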

Please find out more about contributing here!
