6 changes: 6 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,6 @@
version: 2
updates:
  - package-ecosystem: "github-actions"
    # Dependabot expects "/" for the github-actions ecosystem; it looks in
    # .github/workflows automatically.
    directory: "/"
    schedule:
      interval: "daily"
58 changes: 58 additions & 0 deletions .github/workflows/external_link_check.yml
@@ -0,0 +1,58 @@
name: Check for dead external links

on:
  workflow_dispatch:
  schedule:
    - cron: "0 0 * * MON"
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

permissions:
  contents: read
  issues: write

jobs:
  external-link-check:
    name: External Link Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mkdocs mkdocs-material mkdocs-static-i18n[material]

      - name: Build
        run: mkdocs build

      - name: Restore lychee cache
        uses: actions/cache@v4
        with:
          path: .lycheecache
          key: cache-lychee-${{ github.sha }}
          restore-keys: cache-lychee-

      - name: Check external links
        id: lychee-check
        uses: lycheeverse/lychee-action@v2.0.2
        with:
          args: --config lychee.toml --no-progress '**/*.html'
          fail: true

      - name: Create Issue for dead links
        # With "fail: true" the lychee step fails the job on dead links, so this
        # step needs failure() to run at all.
        if: failure() && steps.lychee-check.outputs.exit_code != 0
        uses: peter-evans/create-issue-from-file@v5
        with:
          title: External Link Checker Report
          content-filepath: ./lychee/out.md
          labels: report, automated issue
50 changes: 50 additions & 0 deletions .github/workflows/offline_link_check.yml
@@ -0,0 +1,50 @@
name: Check for dead internal links

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
  schedule:
    - cron: "00 20 * * *"

permissions:
  contents: read
  issues: write

jobs:
  link-check:
    name: Internal Link Check
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mkdocs mkdocs-material mkdocs-static-i18n[material]

      - name: Build
        run: mkdocs build

      - name: Check offline links
        id: lychee-check
        uses: lycheeverse/lychee-action@v2.0.2
        with:
          args: --config lychee.toml --no-progress --offline '**/*.html'
          fail: true

      - name: Create Issue for dead links
        # With "fail: true" the lychee step fails the job on dead links, so this
        # step needs failure() to run at all.
        if: failure() && steps.lychee-check.outputs.exit_code != 0
        uses: peter-evans/create-issue-from-file@v5
        with:
          title: Internal Link Checker Report
          content-filepath: ./lychee/out.md
          labels: report, automated issue
3 changes: 3 additions & 0 deletions README.md
@@ -2,6 +2,9 @@

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![License: CC BY 4.0](https://img.shields.io/badge/License-CC_BY_4.0-lightgrey.svg)](https://creativecommons.org/licenses/by/4.0/)
[![Check internal Links](https://github.com/fairagro/knowledgebase/actions/workflows/links.yml/badge.svg)](https://github.com/fairagro/knowledgebase/actions/workflows/links.yml)


This is the GitHub Repo of the [FAIRagro](https://fairagro.net/en) Knowledge Base, operated by the FAIRagro [DSSC](https://fairagro.net/en/helpdesk) (Data Steward Service Center).
The Knowledge Base is deployed via GitHub-Pages using [<img src="https://upload.wikimedia.org/wikipedia/commons/9/91/Octicons-mark-github.svg" alt="GitHub Icon" width="16"/>](https://github.com/squidfunk/mkdocs-material) [mkdocs-material](https://github.com/squidfunk/mkdocs-material).
12 changes: 12 additions & 0 deletions lychee.toml
@@ -0,0 +1,12 @@
exclude_path = [
  "site/custom/partials/",
  "docs/custom/partials/"
]
cache = true
max_cache_age = "2w"
user_agent = "curl/8.11.1"
timeout = 20
max_retries = 2
retry_wait_time = 5
require_https = true
exclude_mail = true
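For local verification before pushing, the same checks the workflows run can be reproduced with the lychee CLI against this `lychee.toml`. A sketch, assuming `lychee` and the MkDocs toolchain are installed locally and the command is run from the repository root:

```shell
# Build the site so the generated HTML exists to be checked.
mkdocs build

# Internal links only (no network), mirroring offline_link_check.yml.
lychee --config lychee.toml --no-progress --offline '**/*.html'

# All links, including external ones, mirroring external_link_check.yml.
# Populates/uses .lycheecache because cache = true in lychee.toml.
lychee --config lychee.toml --no-progress '**/*.html'
```

lychee exits non-zero when it finds dead links, which is the same `exit_code` the workflows test before opening an issue.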