Compare commits

3 commits: `feature/up...` → `mddb-launc...`

| Author | SHA1 | Date |
|---|---|---|
|  | bfbf0f5304 |  |
|  | 0db2055da3 |  |
|  | d96d5555a0 |  |
8 `.vscode/extensions.json` (vendored, new file)

```diff
@@ -0,0 +1,8 @@
+{
+  "recommendations": [
+    "nrwl.angular-console",
+    "esbenp.prettier-vscode",
+    "firsttris.vscode-jest-runner",
+    "dbaeumer.vscode-eslint"
+  ]
+}
```
```diff
@@ -4,7 +4,7 @@ title: Developer docs for contributors

 ## Our repository

-https://github.com/datopian/datahub
+https://github.com/datopian/portaljs

 Structure:

@@ -17,7 +17,7 @@ Structure:

 ## How to contribute

-You can start by checking our [issues board](https://github.com/datopian/datahub/issues).
+You can start by checking our [issues board](https://github.com/datopian/portaljs/issues).

 If you'd like to work on one of the issues you can:

@@ -35,7 +35,7 @@ If you'd like to work on one of the issues you can:

 If you have an idea for improvement, and it doesn't have a corresponding issue yet, simply submit a new one.

 > [!note]
-> Join our [Discord channel](https://discord.gg/KZSf3FG4EZ) to discuss existing issues and to ask for help.
+> Join our [Discord channel](https://discord.gg/rTxfCutu) to discuss existing issues and to ask for help.

 ## Nx
```
77 `README.md`

```diff
@@ -1,56 +1,31 @@
 <h1 align="center">
-  <a href="https://datahub.io/">
-    <img alt="datahub" src="http://datahub.io/datahub-cube.svg" width="146">
-  </a>
+  🌀 Portal.JS
+  <br />
+  Rapidly build rich data portals using a modern frontend framework
 </h1>

 <p align="center">
-  Bugs, issues and suggestions re DataHub Cloud ☁️ and DataHub OpenSource 🌀
-  <br />
   <br /><a href="https://discord.gg/xfFDMPU9dC"><img src="https://dcbadge.vercel.app/api/server/xfFDMPU9dC" /></a>
 </p>
+
+* [What is Portal.JS?](#What-is-Portal.JS)
+* [Features](#Features)
+* [For developers](#For-developers)
+* [Docs](#Docs)
+* [Community](#Community)
+* [Appendix](#Appendix)
+  * [What happened to Recline?](#What-happened-to-Recline?)

-## DataHub
+# What is Portal.JS

-This repo and issue tracker are for
+🌀 Portal.JS is a framework for rapidly building rich data portal frontends using a modern frontend approach. Portal.JS can be used to present a single dataset or build a full-scale data catalog/portal.

-- DataHub Cloud ☁️ - https://datahub.io/
-- DataHub 🌀 - https://datahub.io/opensource
+Built in JavaScript and React on top of the popular [Next.js](https://nextjs.org/) framework. Portal.JS assumes a "decoupled" approach where the frontend is a separate service from the backend and interacts with backend(s) via an API. It can be used with any backend and has out of the box support for [CKAN](https://ckan.org/).

-### Issues
-
-Found a bug: 👉 https://github.com/datopian/datahub/issues/new
-
-### Discussions
-
-Got a suggestion, a question, want some support or just want to shoot the breeze 🙂
-
-Head to the discussion forum: 👉 https://github.com/datopian/datahub/discussions
-
-### Chat on Discord
-
-If you would prefer to get help via live chat check out our discord 👉
-
-[Discord](https://discord.gg/xfFDMPU9dC)
-
-### Docs
-
-https://datahub.io/docs
-
-## DataHub OpenSource 🌀
-
-DataHub 🌀 is a platform for rapidly creating rich data portal and publishing systems using a modern frontend approach. DataHub can be used to publish a single dataset or build a full-scale data catalog/portal.
-
-DataHub is built in JavaScript and React on top of the popular [Next.js](https://nextjs.org) framework. DataHub assumes a "decoupled" approach where the frontend is a separate service from the backend and interacts with backend(s) via an API. It can be used with any backend and has out of the box support for [CKAN](https://ckan.org/), GitHub, Frictionless Data Packages and more.
-
-### Features
+## Features

 - 🗺️ Unified sites: present data and content in one seamless site, pulling datasets from a DMS (e.g. CKAN) and content from a CMS (e.g. Wordpress) with a common internal API.
 - 👩💻 Developer friendly: built with familiar frontend tech (JavaScript, React, Next.js).
 - 🔋 Batteries included: full set of portal components out of the box e.g. catalog search, dataset showcase, blog, etc.
 - 🎨 Easy to theme and customize: installable themes, use standard CSS and React+CSS tooling. Add new routes quickly.
 - 🧱 Extensible: quickly extend and develop/import your own React components
-- 📝 Well documented: full set of documentation plus the documentation of Next.js.
+- 📝 Well documented: full set of documentation plus the documentation of Next.js and Apollo.

 ### For developers

@@ -58,3 +33,25 @@ DataHub is built in JavaScript and React on top of the popular [Next.js](https:/
 - 🚀 Next.js framework: so everything in Next.js for free: Server Side Rendering, Static Site Generation, huge number of examples and integrations, etc.
 - Server Side Rendering (SSR) => Unlimited number of pages, SEO and more whilst still using React.
 - Static Site Generation (SSG) => Ultra-simple deployment, great performance, great Lighthouse scores and more (good for small sites)

+#### **Check out the [Portal.JS website](https://portaljs.org/) for a gallery of live portals**
+
+___
+
+# Docs
+
+Access the Portal.JS documentation at:
+
+https://portaljs.org/docs
+
+- [Examples](https://portaljs.org/docs#examples)
+
+# Community
+
+If you have questions about anything related to Portal.JS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portal.js/discussions) or on our [Discord server](https://discord.gg/EeyfGrGu4U).
+
+# Appendix
+
+## What happened to Recline?
+
+Portal.JS used to be Recline(JS). If you are looking for the old Recline codebase it still exists: see the [`recline` branch](https://github.com/datopian/portal.js/tree/recline). If you want context for the rename see [this issue](https://github.com/datopian/portal.js/issues/520).
```
```diff
@@ -2,7 +2,7 @@

 **🚩 UPDATE April 2023: This example is now deprecated, though it still works! Please use the [new CKAN examples](https://github.com/datopian/portaljs/tree/main/examples)**

-This example shows how you can build a full data portal using a CKAN backend with a Next.js frontend powered by Apollo. A full-fledged guide is available as a [blog post](https://portaljs.com/blog/example-ckan-2021).
+This example shows how you can build a full data portal using a CKAN backend with a Next.js frontend powered by Apollo. A full-fledged guide is available as a [blog post](https://portaljs.org/blog/example-ckan-2021).

 ## Developers
```
````diff
@@ -1,7 +1,7 @@
 This is a repo intended to serve as an example of a data catalog that gets its data from a CKAN instance.

 ```
-npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan-ssg
+npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan-example
 cd <app-name>
 ```

@@ -19,7 +19,7 @@ npm run dev

 Congratulations, you now have something similar to this running on `http://localhost:4200`

 

 If you go to any one of those pages by clicking on `More info` you will see something similar to this

 

 ## Deployment
````
```diff
@@ -1,6 +1,6 @@
 This example creates a portal/showcase for a single dataset. The dataset should be a [Frictionless dataset (data package)][fd] i.e. there should be a `datapackage.json`.

-[fd]: https://specs.frictionlessdata.io/data-package/
+[fd]: https://frictionlessdata.io/data-packages/

 ## How to use
```
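For orientation, a minimal `datapackage.json` might look like the sketch below. The dataset name, title, and file path are placeholders, not taken from this example:

```json
{
  "name": "my-dataset",
  "title": "My Dataset",
  "resources": [
    {
      "name": "data",
      "path": "data.csv",
      "format": "csv"
    }
  ]
}
```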
```diff
@@ -1,9 +1,3 @@
-# PortalJS Demo replicating the FiveThirtyEight data portal
-
-## 👉 https://fivethirtyeight.portaljs.org 👈
-
-Here's a blog post we wrote about it: https://www.datopian.com/blog/fivethirtyeight-replica
-
 This is a replica of the awesome data.fivethirtyeight.com using PortalJS.

 You might be asking why we did that; there are three main reasons:
```
```diff
@@ -59,7 +59,7 @@ export default function Layout({ children }: { children: React.ReactNode }) {
         <div className="md:flex items-center gap-x-3 text-[#3c3c3c] -mb-1 hidden">
           <a
             className="hover:opacity-75 transition"
-            href="https://portaljs.com"
+            href="https://portaljs.org"
           >
             Built with 🌀PortalJS
           </a>
@@ -77,7 +77,7 @@ export default function Layout({ children }: { children: React.ReactNode }) {
           <li>
             <a
               className="hover:opacity-75 transition"
-              href="https://portaljs.com"
+              href="https://portaljs.org"
             >
               PortalJS
             </a>
```
```diff
@@ -6,7 +6,7 @@ A `datasets.json` file is used to specify which datasets are going to be part of

 The application contains an index page, which lists all the datasets specified in the `datasets.json` file, and users can see more information about each dataset, such as the list of data files in it and the README, by clicking the "info" button on the list.

-You can read more about it on the [Data catalog with data on GitHub](https://portaljs.com/docs/examples/github-backed-catalog) blog post.
+You can read more about it on the [Data catalog with data on GitHub](https://portaljs.org/docs/examples/github-backed-catalog) blog post.

 ## Demo
```
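As a rough sketch of what such a `datasets.json` could contain (the field names here are hypothetical, not taken from this repo), a simple array of GitHub dataset descriptors would be enough for the index page to iterate over:

```json
[
  {
    "owner": "some-github-org",
    "repo": "some-dataset-repo",
    "branch": "main"
  }
]
```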
```diff
@@ -40,7 +40,7 @@ export function Datasets({ projects }) {
       <Link
         target="_blank"
         className="underline"
-        href="https://portaljs.com/"
+        href="https://portaljs.org/"
       >
         🌀 PortalJS
       </Link>
@@ -1 +1 @@
-PortalJS Learn Example - https://portaljs.com/docs
+PortalJS Learn Example - https://portaljs.org/docs
```
```diff
@@ -6,7 +6,7 @@ A `datasets.json` file is used to specify which datasets are going to be part of

 The application contains an index page, which lists all the datasets specified in the `datasets.json` file, and users can see more information about each dataset, such as the list of data files in it and the README, by clicking the "info" button on the list.

-You can read more about it on the [Data catalog with data on GitHub](https://portaljs.com/docs/examples/github-backed-catalog) blog post.
+You can read more about it on the [Data catalog with data on GitHub](https://portaljs.org/docs/examples/github-backed-catalog) blog post.

 ## Demo
```
```diff
@@ -17,7 +17,7 @@ export default function Footer() {
         </a>
       </div>
       <div className="flex gap-x-2 items-center mx-auto h-20">
-        <p className="mt-8 text-base text-slate-500 md:mt-0">Built with <a href="https://portaljs.com" target="_blank" className='text-xl font-medium'>🌀 PortalJS</a></p>
+        <p className="mt-8 text-base text-slate-500 md:mt-0">Built with <a href="https://portaljs.org" target="_blank" className='text-xl font-medium'>🌀 PortalJS</a></p>
       </div>
     </div>
   </footer>
```
```diff
@@ -127,4 +127,4 @@ Based on the bar chart above we can conclude that the following 3 countries have
 2. Poland - EUR ~68b.
 3. Italy - EUR ~35b.

-_This data story was created using Datopian's PortalJS framework. You can learn more about the framework by visiting https://portaljs.com/_
+_This data story was created using Datopian's PortalJS framework. You can learn more about the framework by visiting https://portaljs.org/_
```
```diff
@@ -1,6 +1,6 @@
 This demo data portal is designed for https://hatespeechdata.com. It catalogs datasets annotated for hate speech, online abuse, and offensive language, which are useful for training a natural language processing system to detect this online abuse.

-The site is built on top of [PortalJS](https://portaljs.com/). It catalogs datasets and lists of offensive keywords. It also includes static pages. All of these are stored as markdown files inside the `content` folder.
+The site is built on top of [PortalJS](https://portaljs.org/). It catalogs datasets and lists of offensive keywords. It also includes static pages. All of these are stored as markdown files inside the `content` folder.

 - .md files inside `content/datasets/` will appear on the dataset list section of the homepage and be searchable, as well as having an individual page at `datasets/<file name>`
 - .md files inside `content/keywords/` will appear on the list of offensive keywords section of the homepage, as well as having an individual page at `keywords/<file name>`
```
```diff
@@ -21,7 +21,7 @@ export function Footer() {
       <Container.Inner>
         <div className="flex flex-col items-center justify-between gap-6 sm:flex-row">
           <p className="text-sm font-medium text-zinc-800 dark:text-zinc-200">
-            Built with <a href='https://portaljs.com'>PortalJS 🌀</a>
+            Built with <a href='https://portaljs.org'>PortalJS 🌀</a>
           </p>
           <p className="text-sm text-zinc-400 dark:text-zinc-500">
             © {new Date().getFullYear()} Leon Derczynski. All rights
```
3522 `package-lock.json` (generated): file diff suppressed because it is too large.
```diff
@@ -2,7 +2,7 @@
   "name": "@portaljs/ckan",
   "version": "0.1.0",
   "type": "module",
-  "description": "https://portaljs.com",
+  "description": "https://portaljs.org",
   "keywords": [
     "data portal",
     "data catalog",
```
```diff
@@ -1,16 +1,9 @@
 import 'tailwindcss/tailwind.css'
 import '../src/index.css'

 import type { Preview } from '@storybook/react';

-window.process = {
-  ...window.process,
-  env: {
-    ...window.process?.env,
-  }
-};
-
 const preview: Preview = {
   parameters: {
     actions: { argTypesRegex: '^on[A-Z].*' },
```
```diff
@@ -1,115 +1,5 @@
 # @portaljs/components

-## 1.2.2
-
-### Patch Changes
-
-- [`eeb480e8`](https://github.com/datopian/datahub/commit/eeb480e8cff2d11072ace55ad683a65f54f5d07a) Thanks [@olayway](https://github.com/olayway)! - Adjust `xAxisTimeUnit` property in LineChart to allow for passing `yearmonth`.
-
-## 1.2.1
-
-### Patch Changes
-
-- [`836b143a`](https://github.com/datopian/datahub/commit/836b143a3178b893b1aae3fb511d795dd3a63545) Thanks [@olayway](https://github.com/olayway)! - Fix: make tileLayerName in Map optional.
-
-## 1.2.0
-
-### Minor Changes
-
-- [#1338](https://github.com/datopian/datahub/pull/1338) [`63d9e3b7`](https://github.com/datopian/datahub/commit/63d9e3b7543c38154e6989ef1cc1d694ae9fc4f8) Thanks [@olayway](https://github.com/olayway)! - Support for plotting multiple series in LineChart component.
-
-## 1.1.0
-
-### Minor Changes
-
-- [#1122](https://github.com/datopian/datahub/pull/1122) [`8e349678`](https://github.com/datopian/datahub/commit/8e3496782c022b0653e07f217c6b315ba84e0e61) Thanks [@willy1989cv](https://github.com/willy1989cv)! - Map: allow users to choose a base layer setting
-
-## 1.0.1
-
-### Patch Changes
-
-- [#1170](https://github.com/datopian/datahub/pull/1170) [`9ff25ed7`](https://github.com/datopian/datahub/commit/9ff25ed7c47c8c02cc078c64f76ae35d6754c508) Thanks [@lucasmbispo](https://github.com/lucasmbispo)! - iFrame component: change height
-
-## 1.0.0
-
-### Major Changes
-
-- [#1103](https://github.com/datopian/datahub/pull/1103) [`48cd812a`](https://github.com/datopian/datahub/commit/48cd812a488a069a419d8ecc67f24f94d4d1d1d6) Thanks [@demenech](https://github.com/demenech)! - Components API tidying up and storybook docs improvements.
-
-## 0.6.0
-
-### Minor Changes
-
-- [`a044f56e`](https://github.com/datopian/portaljs/commit/a044f56e3cbe0519ddf9d24d78b0bb7eac917e1c) Thanks [@luccasmmg](https://github.com/luccasmmg)! - Added plotly components
-
-## 0.5.10
-
-### Patch Changes
-
-- [#1083](https://github.com/datopian/portaljs/pull/1083) [`86a2945e`](https://github.com/datopian/portaljs/commit/86a2945ee68dfcea0299984ca9cc9070d68fe1c2) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Created integration with datastore api for table component
-
-## 0.5.9
-
-### Patch Changes
-
-- [#1081](https://github.com/datopian/portaljs/pull/1081) [`2bbf3134`](https://github.com/datopian/portaljs/commit/2bbf3134896df3ecc66560bdf95bece143614c7b) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Fixed error to remove anchor from document
-
-## 0.5.8
-
-### Patch Changes
-
-- [#1079](https://github.com/datopian/portaljs/pull/1079) [`058d2367`](https://github.com/datopian/portaljs/commit/058d23678a024890f8a6d909ded9fc8fc11cf145) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Changed the download behaviour of the bucket viewer component and removed loading component while downloading
-
-## 0.5.7
-
-### Patch Changes
-
-- [#1077](https://github.com/datopian/portaljs/pull/1077) [`6d7acd27`](https://github.com/datopian/portaljs/commit/6d7acd27ed9299cbcc14eab906f2f0eb414656b8) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Created property to present a component while is loading the download of the file and fixed download bug on pagination
-
-## 0.5.6
-
-### Patch Changes
-
-- [#1075](https://github.com/datopian/portaljs/pull/1075) [`26dcffc2`](https://github.com/datopian/portaljs/commit/26dcffc279057f80a579134e862085ba042c06c3) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Fixed problem presenting the download component in the first load of the bucket viewer
-
-## 0.5.5
-
-### Patch Changes
-
-- [#1073](https://github.com/datopian/portaljs/pull/1073) [`cf24042a`](https://github.com/datopian/portaljs/commit/cf24042a910567e98eeb75ade42ce0149bdb62d1) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Fixed filter by startDate error
-
-## 0.5.4
-
-### Patch Changes
-
-- [#1071](https://github.com/datopian/portaljs/pull/1071) [`27c99add`](https://github.com/datopian/portaljs/commit/27c99adde8fa36ad2c2e03f227f93aa62454eefa) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Added pagination and filter properties for the BucketViewer component
-
-## 0.5.3
-
-### Patch Changes
-
-- [#1066](https://github.com/datopian/portaljs/pull/1066) [`dd03a493`](https://github.com/datopian/portaljs/commit/dd03a493beca5459d1ef447b2df505609fc64e95) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Created Iframe component
-
-## 0.5.2
-
-### Patch Changes
-
-- [#1063](https://github.com/datopian/portaljs/pull/1063) [`b13e3ade`](https://github.com/datopian/portaljs/commit/b13e3ade3ccefe7dffe84f824bdedd3e512ce499) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Created auto zoom configuration for the map component
-
-## 0.5.1
-
-### Patch Changes
-
-- [#1061](https://github.com/datopian/portaljs/pull/1061) [`4ddfc112`](https://github.com/datopian/portaljs/commit/4ddfc1126a3f0b8137ea47a08a36c56b7373b8f6) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Created the style property in the Map component
-
-## 0.5.0
-
-### Minor Changes
-
-- [#1055](https://github.com/datopian/portaljs/pull/1055) [`712f4a3b`](https://github.com/datopian/portaljs/commit/712f4a3b0f074e654879bb75059f51e06b422b32) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Creation of BucketViewer component to show the data of public buckets
-
-- [#1057](https://github.com/datopian/portaljs/pull/1057) [`61c750b7`](https://github.com/datopian/portaljs/commit/61c750b7e11fe52bf04d25f192440ee1bb307404) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Exporting BucketViewer to be accessed out of the folder
-
 ## 0.4.0

 ### Minor Changes
```
```diff
@@ -1,7 +1,7 @@
 # PortalJS React Components

 **Storybook:** https://storybook.portaljs.org
-**Docs**: https://portaljs.com/opensource
+**Docs**: https://portaljs.org/docs

 ## Usage
```
```diff
@@ -1,8 +1,8 @@
 {
   "name": "@portaljs/components",
-  "version": "1.2.2",
+  "version": "0.4.0",
   "type": "module",
-  "description": "https://portaljs.com",
+  "description": "https://portaljs.org",
   "keywords": [
     "data portal",
     "data catalog",
@@ -29,8 +29,6 @@
     "@githubocto/flat-ui": "^0.14.1",
     "@heroicons/react": "^2.0.17",
     "@planet/maps": "^8.1.0",
-    "@react-pdf-viewer/core": "3.6.0",
-    "@react-pdf-viewer/default-layout": "3.6.0",
     "@tanstack/react-table": "^8.8.5",
     "ag-grid-react": "^30.0.4",
     "chroma-js": "^2.4.2",
@@ -39,19 +37,19 @@
     "next-mdx-remote": "^4.4.1",
     "ol": "^7.4.0",
     "papaparse": "^5.4.1",
-    "pdfjs-dist": "2.15.349",
-    "plotly.js": "^2.30.1",
     "postcss-url": "^10.1.3",
     "react": "^18.2.0",
     "react-dom": "^18.2.0",
     "react-hook-form": "^7.43.9",
     "react-leaflet": "^4.2.1",
-    "react-plotly.js": "^2.6.0",
     "react-query": "^3.39.3",
     "react-vega": "^7.6.0",
     "vega": "5.25.0",
     "vega-lite": "5.1.0",
     "vitest": "^0.31.4",
+    "@react-pdf-viewer/core": "3.6.0",
+    "@react-pdf-viewer/default-layout": "3.6.0",
+    "pdfjs-dist": "2.15.349",
     "xlsx": "^0.18.5"
   },
   "devDependencies": {
```
The `BucketViewer` component file was deleted in this compare (`@@ -1,222 +0,0 @@`):

```tsx
import { CSSProperties, ReactNode, useEffect, useState } from 'react';
import LoadingSpinner from './LoadingSpinner';

export interface BucketViewerFilterSearchedDataEvent {
  startDate?: Date;
  endDate?: Date;
}

export interface BucketViewerProps {
  onLoadTotalNumberOfItems?: (total: number) => void;
  domain: string;
  downloadConfig?: {
    hoverOfTheFileComponent?: ReactNode;
  };
  suffix?: string;
  className?: string;
  paginationConfig?: BucketViewerPaginationConfig;
  filterState?: BucketViewerFilterSearchedDataEvent;
  dataMapperFn: (rawData: Response) => Promise<BucketViewerData[]>;
}

export interface BucketViewerPaginationConfig {
  containerClassName?: string;
  containerStyles?: CSSProperties;
  itemsPerPage: number;
}

export interface BucketViewerData {
  fileName: string;
  downloadFileUri: string;
  dateProps?: {
    date: Date;
    dateFormatter?: (date: Date) => string;
  };
}

export function BucketViewer({
  domain,
  suffix,
  dataMapperFn,
  className,
  filterState,
  paginationConfig,
  downloadConfig,
  onLoadTotalNumberOfItems,
}: BucketViewerProps) {
  suffix = suffix ?? '/';

  const { hoverOfTheFileComponent } = downloadConfig ?? {};
  const [isLoading, setIsLoading] = useState<boolean>(false);
  const [showDownloadComponentOnLine, setShowDownloadComponentOnLine] =
    useState(-1);
  const [currentPage, setCurrentPage] = useState<number>(0);
  const [lastPage, setLastPage] = useState<number>(0);
  const [bucketFiles, setBucketFiles] = useState<BucketViewerData[]>([]);
  const [paginatedData, setPaginatedData] = useState<BucketViewerData[]>([]);
  const [filteredData, setFilteredData] = useState<BucketViewerData[]>([]);

  useEffect(() => {
    setIsLoading(true);
    fetch(`${domain}${suffix}`)
      .then((res) => dataMapperFn(res))
      .then((data) => {
        setBucketFiles(data);
        setFilteredData(data);
      })
      .finally(() => setIsLoading(false));
  }, [domain, suffix]);

  useEffect(() => {
    if (paginationConfig) {
      const startIndex = paginationConfig
        ? currentPage * paginationConfig.itemsPerPage
        : 0;

      const endIndex = paginationConfig
        ? startIndex + paginationConfig.itemsPerPage
        : 0;

      setLastPage(
        Math.ceil(filteredData.length / paginationConfig.itemsPerPage) - 1
      );
      setPaginatedData(filteredData.slice(startIndex, endIndex));
    }
  }, [currentPage, filteredData]);

  useEffect(() => {
    if (onLoadTotalNumberOfItems) onLoadTotalNumberOfItems(filteredData.length);
  }, [filteredData]);

  useEffect(() => {
    if (!filterState) return;

    if (filterState.startDate && filterState.endDate) {
      setFilteredData(
        bucketFiles.filter(({ dateProps }) =>
          dateProps
            ? dateProps.date.getTime() >= filterState.startDate.getTime() &&
              dateProps.date.getTime() <= filterState.endDate.getTime()
            : true
        )
      );
    } else if (filterState.startDate) {
      setFilteredData(
        bucketFiles.filter(({ dateProps }) =>
          dateProps
            ? dateProps.date.getTime() >= filterState.startDate.getTime()
            : true
        )
      );
    } else if (filterState.endDate) {
      setFilteredData(
        bucketFiles.filter(({ dateProps }) =>
          dateProps
            ? dateProps.date.getTime() <= filterState.endDate.getTime()
            : true
        )
      );
    } else {
      setFilteredData(bucketFiles);
    }
  }, [filterState]);

  return isLoading ? (
    <div className="w-full flex items-center justify-center h-[300px]">
      <LoadingSpinner />
    </div>
  ) : bucketFiles ? (
    <>
      {...(paginationConfig && bucketFiles ? paginatedData : filteredData)?.map(
        (data, i) => (
          <ul
            onClick={() => {
              const a: HTMLAnchorElement = document.createElement('a');
              a.href = data.downloadFileUri;
              a.target = `_blank`;
              a.download = data.fileName;
              document.body.appendChild(a);
              a.click();
              document.body.removeChild(a);
            }}
            key={i}
            onMouseEnter={() => setShowDownloadComponentOnLine(i)}
            onMouseLeave={() => setShowDownloadComponentOnLine(undefined)}
            className={`${
              className ??
              'mb-2 border-b-[2px] border-b-[red] hover:cursor-pointer'
            }`}
          >
            {hoverOfTheFileComponent && showDownloadComponentOnLine === i ? (
              hoverOfTheFileComponent
            ) : (
              <></>
            )}
            <div className="flex justify-between w-full items-center">
              <div>
                <li>{data.fileName}</li>
                {data.dateProps && data.dateProps.dateFormatter ? (
                  <li>{data.dateProps.dateFormatter(data.dateProps.date)}</li>
                ) : (
                  <></>
                )}
              </div>
            </div>
          </ul>
        )
      )}
      {paginationConfig ? (
        <ul
          className={
            paginationConfig.containerClassName
              ? paginationConfig.containerClassName
              : 'flex justify-end gap-x-[0.5rem] w-full'
          }
          style={paginationConfig.containerStyles ?? {}}
        >
          <li>
            <button
              className="hover:cursor-pointer hover:disabled:cursor-not-allowed"
              disabled={currentPage === 0}
              onClick={() => setCurrentPage(0)}
            >
              First
            </button>
          </li>
          <li>
            <button
              className="hover:cursor-pointer hover:disabled:cursor-not-allowed"
              onClick={() => setCurrentPage(currentPage - 1)}
              disabled={currentPage === 0}
            >
              Previous
            </button>
          </li>
          <label>{currentPage + 1}</label>

          <li>
            <button
              onClick={() => setCurrentPage(currentPage + 1)}
              disabled={currentPage >= lastPage}
              className="hover:cursor-pointer hover:disabled:cursor-not-allowed"
            >
              Next
            </button>
          </li>

          <li>
            <button
              onClick={() => setCurrentPage(lastPage)}
              disabled={currentPage >= lastPage}
              className="hover:cursor-pointer hover:disabled:cursor-not-allowed"
            >
              Last
            </button>
          </li>
        </ul>
      ) : (
        <></>
      )}
    </>
  ) : null;
}
```
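The pagination effect in `BucketViewer` slices the filtered list by page index and page size. That arithmetic can be sketched as a standalone function (plain TypeScript; the function name and return shape are ours, not part of the component's API):

```typescript
// Minimal sketch of BucketViewer's pagination math: given all filtered
// items, a zero-based page index, and a page size, return the items on
// that page plus the index of the last page (ceil(n / perPage) - 1).
function paginate<T>(
  items: T[],
  currentPage: number,
  itemsPerPage: number
): { pageItems: T[]; lastPage: number } {
  const startIndex = currentPage * itemsPerPage;
  const endIndex = startIndex + itemsPerPage;
  const lastPage = Math.ceil(items.length / itemsPerPage) - 1;
  return { pageItems: items.slice(startIndex, endIndex), lastPage };
}

const files = ['a.csv', 'b.csv', 'c.csv', 'd.csv', 'e.csv'];
const page0 = paginate(files, 0, 2); // pageItems: ['a.csv', 'b.csv'], lastPage: 2
const page2 = paginate(files, 2, 2); // pageItems: ['e.csv'], lastPage: 2
```

This mirrors why the component disables "Next"/"Last" once `currentPage >= lastPage`: the last page index is derived from the filtered length, not the full bucket.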
```diff
@@ -7,12 +7,7 @@ export function Catalog({
   datasets,
   facets,
 }: {
-  datasets: {
-    _id: string | number;
-    metadata: { title: string; [k: string]: string | number };
-    url_path: string;
-    [k: string]: any;
-  }[];
+  datasets: any[];
   facets: string[];
 }) {
   const [indexFilter, setIndexFilter] = useState('');
@@ -61,7 +56,7 @@ export function Catalog({
       //Then check if the selectedValue for the given facet is included in the dataset metadata
       .filter((dataset) => {
         //Avoids a server rendering breakage
-        if (!watch() || Object.keys(watch()).length === 0) return true;
+        if (!watch() || Object.keys(watch()).length === 0) return true
         //This will filter only the key pairs of the metadata values that were selected as facets
         const datasetFacets = Object.entries(dataset.metadata).filter((entry) =>
           facets.includes(entry[0])
@@ -91,7 +86,9 @@ export function Catalog({
             className="p-2 ml-1 text-sm shadow border border-block"
             {...register(elem[0] + '.selectedValue')}
           >
-            <option value="">Filter by {elem[0]}</option>
+            <option value="">
+              Filter by {elem[0]}
+            </option>
             {(elem[1] as { possibleValues: string[] }).possibleValues.map(
               (val) => (
                 <option
@@ -105,10 +102,10 @@ export function Catalog({
             )}
           </select>
         ))}
-        <ul className="mb-5 pl-6 mt-5 list-disc">
+        <ul className='mb-5 pl-6 mt-5 list-disc'>
           {filteredDatasets.map((dataset) => (
-            <li className="py-2" key={dataset._id}>
-              <a className="font-medium underline" href={dataset.url_path}>
+            <li className='py-2' key={dataset._id}>
+              <a className='font-medium underline' href={dataset.url_path}>
               {dataset.metadata.title
                 ? dataset.metadata.title
                 : dataset.url_path}
@@ -119,3 +116,4 @@ export function Catalog({
   </>
   );
 }
```
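The filtering step in `Catalog` keeps a dataset only when every facet with a selected value matches that dataset's metadata. A self-contained sketch of that check (simplified: the real component reads the selections from `react-hook-form`'s `watch()`, and the helper name here is ours):

```typescript
type Metadata = { [key: string]: string | number };

// Keep a dataset if, for every facet'ed metadata key, either no value is
// selected for that facet or the dataset's value equals the selection.
function matchesFacets(
  metadata: Metadata,
  facets: string[],
  selections: { [facet: string]: string }
): boolean {
  return Object.entries(metadata)
    .filter(([key]) => facets.includes(key)) // only keys used as facets
    .every(([key, value]) => {
      const selected = selections[key];
      return !selected || String(value) === selected;
    });
}

const dataset = { title: 'Air quality', country: 'PL', year: 2021 };
matchesFacets(dataset, ['country'], { country: 'PL' }); // true
matchesFacets(dataset, ['country'], { country: 'IT' }); // false
matchesFacets(dataset, ['country', 'year'], {});        // true (nothing selected)
```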
```diff
@@ -4,14 +4,12 @@ import { read, utils } from 'xlsx';
 import { AgGridReact } from 'ag-grid-react';
 import 'ag-grid-community/styles/ag-grid.css';
 import 'ag-grid-community/styles/ag-theme-alpine.css';
-import { Data } from '../types/properties';

 export type ExcelProps = {
-  data: Required<Pick<Data, 'url'>>;
+  url: string;
 };

-export function Excel({ data }: ExcelProps) {
-  const url = data.url;
+export function Excel({ url }: ExcelProps) {
   const [isLoading, setIsLoading] = useState<boolean>(false);
   const [activeSheetName, setActiveSheetName] = useState<string>();
   const [workbook, setWorkbook] = useState<any>();
```
|
||||
|
@@ -2,7 +2,6 @@ import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import Papa from 'papaparse';
import { Grid } from '@githubocto/flat-ui';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

@@ -37,25 +36,30 @@ export async function parseCsv(file: string, parsingConfig): Promise<any> {
}

export interface FlatUiTableProps {
data: Data;
uniqueId?: number;
url?: string;
data?: { [key: string]: number | string }[];
rawCsv?: string;
randomId?: number;
bytes: number;
parsingConfig: any;
}
export const FlatUiTable: React.FC<FlatUiTableProps> = ({
url,
data,
uniqueId,
rawCsv,
bytes = 5132288,
parsingConfig = {},
}) => {
uniqueId = uniqueId ?? Math.random();
const randomId = Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<TableInner
bytes={bytes}
url={url}
data={data}
uniqueId={uniqueId}
rawCsv={rawCsv}
randomId={randomId}
parsingConfig={parsingConfig}
/>
</QueryClientProvider>
@@ -63,32 +67,33 @@ export const FlatUiTable: React.FC<FlatUiTableProps> = ({
};

const TableInner: React.FC<FlatUiTableProps> = ({
url,
data,
uniqueId,
rawCsv,
randomId,
bytes,
parsingConfig,
}) => {
const url = data.url;
const csv = data.csv;
const values = data.values;

if (values) {
if (data) {
return (
<div className="w-full" style={{ height: '500px' }}>
<Grid data={values} />
<Grid data={data} />
</div>
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
['dataCsv', url, uniqueId],
['dataCsv', url, randomId],
() => getCsv(url as string, bytes),
{ enabled: !!url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
['dataPreview', csvString, uniqueId],
['dataPreview', csvString, randomId],
() =>
parseCsv(csv ? (csv as string) : (csvString as string), parsingConfig),
{ enabled: csv ? true : !!csvString }
parseCsv(
rawCsv ? (rawCsv as string) : (csvString as string),
parsingConfig
),
{ enabled: rawCsv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">

@@ -1,17 +0,0 @@
import { CSSProperties } from 'react';
import { Data } from '../types/properties';

export interface IframeProps {
data: Required<Pick<Data, 'url'>>;
style?: CSSProperties;
}

export function Iframe({ data, style }: IframeProps) {
const url = data.url;
return (
<iframe
src={url}
style={style ?? { width: `100%`, height: `600px` }}
></iframe>
);
}
@@ -2,40 +2,35 @@ import { useEffect, useState } from 'react';
import LoadingSpinner from './LoadingSpinner';
import { VegaLite } from './VegaLite';
import loadData from '../lib/loadData';
import { Data } from '../types/properties';

type AxisType = 'quantitative' | 'temporal';
type TimeUnit = 'year' | 'yearmonth' | undefined; // or ...
type TimeUnit = 'year' | undefined; // or ...

export type LineChartProps = {
data: Omit<Data, 'csv'>;
data: Array<Array<string | number>> | string | { x: string; y: number }[];
title?: string;
xAxis: string;
xAxis?: string;
xAxisType?: AxisType;
xAxisTimeUnit?: TimeUnit;
yAxis: string | string[];
xAxisTimeUnit: TimeUnit;
yAxis?: string;
yAxisType?: AxisType;
fullWidth?: boolean;
symbol?: string;
};

export function LineChart({
data,
data = [],
fullWidth = false,
title = '',
xAxis,
xAxis = 'x',
xAxisType = 'temporal',
xAxisTimeUnit = 'year', // TODO: defaults to undefined would probably work better... keeping it as it's for compatibility purposes
yAxis,
yAxis = 'y',
yAxisType = 'quantitative',
symbol,
}: LineChartProps) {
const url = data.url;
const values = data.values;
const [isLoading, setIsLoading] = useState<boolean>(false);

// By default, assumes data is an Array...
const [specData, setSpecData] = useState<any>({ name: 'table' });
const isMultiYAxis = Array.isArray(yAxis);

const spec = {
$schema: 'https://vega.github.io/schema/vega-lite/v5.json',
@@ -49,11 +44,6 @@ export function LineChart({
tooltip: true,
},
data: specData,
...(isMultiYAxis
? {
transform: [{ fold: yAxis, as: ['key', 'value'] }],
}
: {}),
selection: {
grid: {
type: 'interval',
@@ -67,35 +57,20 @@ export function LineChart({
type: xAxisType,
},
y: {
field: isMultiYAxis ? 'value' : yAxis,
field: yAxis,
type: yAxisType,
},
...(symbol
? {
color: {
field: symbol,
type: 'nominal',
},
}
: {}),
...(isMultiYAxis
? {
color: {
field: 'key',
type: 'nominal',
},
}
: {}),
},
} as any;

useEffect(() => {
if (url) {
// If data is string, assume it's a URL
if (typeof data === 'string') {
setIsLoading(true);

// Manualy loading the data allows us to do other kinds
// of stuff later e.g. load a file partially
loadData(url).then((res: any) => {
loadData(data).then((res: any) => {
setSpecData({ values: res, format: { type: 'csv' } });
setIsLoading(false);
});
@@ -103,8 +78,12 @@ export function LineChart({
}, []);

var vegaData = {};
if (values) {
vegaData = { table: values };
if (Array.isArray(data)) {
var dataObj;
dataObj = data.map((r) => {
return { x: r[0], y: r[1] };
});
vegaData = { table: dataObj };
}

return isLoading ? (
@@ -112,6 +91,6 @@ export function LineChart({
<LoadingSpinner />
</div>
) : (
<VegaLite data={vegaData} spec={spec} />
<VegaLite fullWidth={fullWidth} data={vegaData} spec={spec} />
);
}

@@ -1,8 +1,7 @@
import { CSSProperties, useEffect, useState } from 'react';
import { useEffect, useState } from 'react';
import LoadingSpinner from './LoadingSpinner';
import loadData from '../lib/loadData';
import chroma from 'chroma-js';
import { GeospatialData } from '../types/properties';
import {
MapContainer,
TileLayer,
@@ -12,34 +11,10 @@ import {

import 'leaflet/dist/leaflet.css';
import * as L from 'leaflet';
import providers from '../lib/tileLayerPresets';

type VariantKeys<T> = T extends { variants: infer V }
? {
[K in keyof V]: K extends string
? `${K}` | `${K}.${VariantKeys<V[K]>}`
: never;
}[keyof V]
: never;

type ProviderVariantKeys<T> = {
[K in keyof T]: K extends string
? `${K}` | `${K}.${VariantKeys<T[K]>}`
: never;
}[keyof T];

type TileLayerPreset = ProviderVariantKeys<typeof providers> | 'custom';

interface TileLayerSettings extends L.TileLayerOptions {
url?: string;
variant?: string | any;
}

export type MapProps = {
tileLayerName?: TileLayerPreset;
tileLayerOptions?: TileLayerSettings | undefined;
layers: {
data: GeospatialData;
data: string | GeoJSON.GeoJSON;
name: string;
colorScale?: {
starting: string;
@@ -50,29 +25,14 @@ export type MapProps = {
propNames: string[];
}
| boolean;
_id?: number;
}[];
title?: string;
center?: { latitude: number | undefined; longitude: number | undefined };
zoom?: number;
style?: CSSProperties;
autoZoomConfiguration?: {
layerName: string;
};
};

const tileLayerDefaultName = process?.env
.NEXT_PUBLIC_MAP_TILE_LAYER_NAME as TileLayerPreset;

const tileLayerDefaultOptions = Object.keys(process?.env)
.filter((key) => key.startsWith('NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_'))
.reduce((obj, key) => {
obj[key.split('NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_')[1]] = process.env[key];
return obj;
}, {}) as TileLayerSettings;

export function Map({
tileLayerName = tileLayerDefaultName || 'OpenStreetMap',
tileLayerOptions,
layers = [
{
data: null,
@@ -84,116 +44,23 @@ export function Map({
center = { latitude: 45, longitude: 45 },
zoom = 2,
title = '',
style = {},
autoZoomConfiguration = undefined,
}: MapProps) {
const [isLoading, setIsLoading] = useState<boolean>(false);
const [layersData, setLayersData] = useState<any>([]);

/*
tileLayerDefaultOptions
extract all environment variables thats starts with NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_.
the variables names are the same as the TileLayer object properties:
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_url:
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_attribution
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_accessToken
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_id
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_ext
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_bounds
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_maxZoom
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_minZoom
see TileLayerOptions inteface
*/

//tileLayerData prioritizes properties passed through component over those passed through .env variables
tileLayerOptions = Object.assign(tileLayerDefaultOptions, tileLayerOptions);

let provider = {
url: tileLayerOptions.url,
options: tileLayerOptions,
};

if (tileLayerName != 'custom') {
var parts = tileLayerName.split('.');
var providerName = parts[0];
var variantName: string = parts[1];

//make sure to declare a variant if url depends on a variant: assume first
if (providers[providerName].url?.includes('{variant}') && !variantName)
variantName = Object.keys(providers[providerName].variants)[0];

if (!providers[providerName]) {
throw 'No such provider (' + providerName + ')';
}

provider = {
url: providers[providerName].url,
options: providers[providerName].options,
};

// overwrite values in provider from variant.
if (variantName && 'variants' in providers[providerName]) {
if (!(variantName in providers[providerName].variants)) {
throw 'No such variant of ' + providerName + ' (' + variantName + ')';
}
var variant = providers[providerName].variants[variantName];
var variantOptions;
if (typeof variant === 'string') {
variantOptions = {
variant: variant,
};
} else {
variantOptions = variant.options;
}
provider = {
url: variant.url || provider.url,
options: L.Util.extend({}, provider.options, variantOptions),
};
}

var attributionReplacer = function (attr) {
if (attr.indexOf('{attribution.') === -1) {
return attr;
}
return attr.replace(
/\{attribution.(\w*)\}/g,
function (match: any, attributionName: string) {
match;
return attributionReplacer(
providers[attributionName].options.attribution
);
}
);
};

provider.options.attribution = attributionReplacer(
provider.options.attribution
);
}

var tileLayerData = L.Util.extend(
{
url: provider.url,
},
provider.options,
tileLayerOptions
);

useEffect(() => {
const loadDataPromises = layers.map(async (layer) => {
const url = layer.data.url;
const geojson = layer.data.geojson;
let layerData: any;

if (url) {
if (typeof layer.data === 'string') {
// If "data" is string, assume it's a URL
setIsLoading(true);
layerData = await loadData(url).then((res: any) => {
layerData = await loadData(layer.data).then((res: any) => {
return JSON.parse(res);
});
} else {
// Else, expect raw GeoJSON
layerData = geojson;
layerData = layer.data;
}

if (layer.colorScale) {
@@ -225,12 +92,10 @@ export function Map({
</div>
) : (
<MapContainer
key={layersData}
center={[center.latitude, center.longitude]}
zoom={zoom}
scrollWheelZoom={false}
className="h-80 w-full"
style={style ?? {}}
// @ts-ignore
whenReady={(map: any) => {
// Enable zoom using scroll wheel
@@ -250,28 +115,12 @@ export function Map({
};

if (title) info.addTo(map.target);
if (!autoZoomConfiguration) return;

let layerToZoomBounds = L.latLngBounds(L.latLng(0, 0), L.latLng(0, 0));

layers.forEach((layer) => {
if (layer.name === autoZoomConfiguration.layerName) {
const data = layersData.find(
(layerData) => layerData.name === layer.name
)?.data;

if (data) {
layerToZoomBounds = L.geoJSON(data).getBounds();
return;
}
}
});

map.target.fitBounds(layerToZoomBounds);
}}
>
{tileLayerData.url && <TileLayer {...tileLayerData} />}

<TileLayer
attribution='© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
/>
<LayersControl position="bottomright">
{layers.map((layer) => {
const data = layersData.find(

@@ -1,24 +1,22 @@
// Core viewer
import { Viewer, Worker, SpecialZoomLevel } from '@react-pdf-viewer/core';
import { defaultLayoutPlugin } from '@react-pdf-viewer/default-layout';
import { Data } from '../types/properties';

// Import styles
import '@react-pdf-viewer/core/lib/styles/index.css';
import '@react-pdf-viewer/default-layout/lib/styles/index.css';

export interface PdfViewerProps {
data: Required<Pick<Data, 'url'>>;
url: string;
layout: boolean;
parentClassName?: string;
}

export function PdfViewer({
data,
url,
layout = false,
parentClassName = 'h-screen',
parentClassName,
}: PdfViewerProps) {
const url = data.url;
const defaultLayoutPluginInstance = defaultLayoutPlugin();
return (
<Worker workerUrl="https://unpkg.com/pdfjs-dist@2.15.349/build/pdf.worker.js">

@@ -1,9 +0,0 @@
import Plot, { PlotParams } from "react-plotly.js";

export const Plotly: React.FC<PlotParams> = (props) => {
return (
<div>
<Plot {...props} />
</div>
);
};
@@ -1,153 +0,0 @@
import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import { Plotly } from './Plotly';
import Papa, { ParseConfig } from 'papaparse';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

async function getCsv(url: string, bytes: number) {
const response = await fetch(url, {
headers: {
Range: `bytes=0-${bytes}`,
},
});
const data = await response.text();
return data;
}

async function parseCsv(
file: string,
parsingConfig: ParseConfig
): Promise<any> {
return new Promise((resolve, reject) => {
Papa.parse(file, {
...parsingConfig,
header: true,
dynamicTyping: true,
skipEmptyLines: true,
transform: (value: string): string => {
return value.trim();
},
complete: (results: any) => {
return resolve(results);
},
error: (error: any) => {
return reject(error);
},
});
});
}

export interface PlotlyBarChartProps {
data: Data;
uniqueId?: number;
bytes?: number;
parsingConfig?: ParseConfig;
xAxis: string;
yAxis: string;
// TODO: commented out because this doesn't work. I believe
// this would only make any difference on charts with multiple
// traces.
// lineLabel?: string;
title?: string;
}

export const PlotlyBarChart: React.FC<PlotlyBarChartProps> = ({
data,
bytes = 5132288,
parsingConfig = {},
xAxis,
yAxis,
// lineLabel,
title = '',
}) => {
const uniqueId = Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<PlotlyBarChartInner
data={data}
uniqueId={uniqueId}
bytes={bytes}
parsingConfig={parsingConfig}
xAxis={xAxis}
yAxis={yAxis}
// lineLabel={lineLabel ?? yAxis}
title={title}
/>
</QueryClientProvider>
);
};

const PlotlyBarChartInner: React.FC<PlotlyBarChartProps> = ({
data,
uniqueId,
bytes,
parsingConfig,
xAxis,
yAxis,
// lineLabel,
title,
}) => {
if (data.values) {
return (
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: data.values.map((d) => d[xAxis]),
y: data.values.map((d) => d[yAxis]),
type: 'bar',
// name: lineLabel,
},
]}
/>
</div>
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
['dataCsv', data.url, uniqueId],
() => getCsv(data.url as string, bytes ?? 5132288),
{ enabled: !!data.url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
['dataPreview', csvString, uniqueId],
() =>
parseCsv(
data.csv ? (data.csv as string) : (csvString as string),
parsingConfig ?? {}
),
{ enabled: data.csv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">
<LoadingSpinner />
</div>;
if (parsedData)
return (
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: parsedData.data.map((d: any) => d[xAxis]),
y: parsedData.data.map((d: any) => d[yAxis]),
type: 'bar',
// name: lineLabel, TODO: commented out because this doesn't work
},
]}
/>
</div>
);
return (
<div className="w-full flex justify-center items-center h-[500px]">
<LoadingSpinner />
</div>
);
};
@@ -1,155 +0,0 @@
import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import { Plotly } from './Plotly';
import Papa, { ParseConfig } from 'papaparse';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

async function getCsv(url: string, bytes: number) {
const response = await fetch(url, {
headers: {
Range: `bytes=0-${bytes}`,
},
});
const data = await response.text();
return data;
}

async function parseCsv(
file: string,
parsingConfig: ParseConfig
): Promise<any> {
return new Promise((resolve, reject) => {
Papa.parse(file, {
...parsingConfig,
header: true,
dynamicTyping: true,
skipEmptyLines: true,
transform: (value: string): string => {
return value.trim();
},
complete: (results: any) => {
return resolve(results);
},
error: (error: any) => {
return reject(error);
},
});
});
}

export interface PlotlyLineChartProps {
data: Data;
bytes?: number;
parsingConfig?: ParseConfig;
xAxis: string;
yAxis: string;
lineLabel?: string;
title?: string;
uniqueId?: number;
}

export const PlotlyLineChart: React.FC<PlotlyLineChartProps> = ({
data,
bytes = 5132288,
parsingConfig = {},
xAxis,
yAxis,
lineLabel,
title = '',
uniqueId,
}) => {
uniqueId = uniqueId ?? Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<LineChartInner
data={data}
uniqueId={uniqueId}
bytes={bytes}
parsingConfig={parsingConfig}
xAxis={xAxis}
yAxis={yAxis}
lineLabel={lineLabel ?? yAxis}
title={title}
/>
</QueryClientProvider>
);
};

const LineChartInner: React.FC<PlotlyLineChartProps> = ({
data,
uniqueId,
bytes,
parsingConfig,
xAxis,
yAxis,
lineLabel,
title,
}) => {
const values = data.values;
const url = data.url;
const csv = data.csv;

if (values) {
return (
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: values.map((d) => d[xAxis]),
y: values.map((d) => d[yAxis]),
mode: 'lines',
name: lineLabel,
},
]}
/>
</div>
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
['dataCsv', url, uniqueId],
() => getCsv(url as string, bytes ?? 5132288),
{ enabled: !!url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
['dataPreview', csvString, uniqueId],
() =>
parseCsv(
csv ? (csv as string) : (csvString as string),
parsingConfig ?? {}
),
{ enabled: csv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">
<LoadingSpinner />
</div>;
if (parsedData)
return (
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: parsedData.data.map((d: any) => d[xAxis]),
y: parsedData.data.map((d: any) => d[yAxis]),
mode: 'lines',
name: lineLabel,
},
]}
/>
</div>
);
return (
<div className="w-full flex justify-center items-center h-[500px]">
<LoadingSpinner />
</div>
);
};
@@ -6,8 +6,6 @@ import {
getFilteredRowModel,
getPaginationRowModel,
getSortedRowModel,
PaginationState,
Table as ReactTable,
useReactTable,
} from '@tanstack/react-table';

@@ -27,19 +25,12 @@ import DebouncedInput from './DebouncedInput';
import loadData from '../lib/loadData';
import LoadingSpinner from './LoadingSpinner';

export type TableData = { cols: {key: string, name: string}[]; data: any[]; total: number };

export type TableProps = {
data?: Array<{ [key: string]: number | string }>;
cols?: Array<{ [key: string]: string }>;
csv?: string;
url?: string;
fullWidth?: boolean;
datastoreConfig?: {
dataStoreURI: string;
rowsPerPage?: number;
dataMapperFn: (data) => Promise<TableData> | TableData;
};
};

export const Table = ({
@@ -48,28 +39,8 @@ export const Table = ({
csv = '',
url = '',
fullWidth = false,
datastoreConfig,
}: TableProps) => {
const [isLoading, setIsLoading] = useState<boolean>(false);
const [pageMap, setPageMap] = useState(new Map<number, boolean>());
const {
dataMapperFn,
dataStoreURI,
rowsPerPage = 10,
} = datastoreConfig ?? {};

const [globalFilter, setGlobalFilter] = useState('');
const [isLoadingPage, setIsLoadingPage] = useState<boolean>(false);
const [totalOfRows, setTotalOfRows] = useState<number>(0);

const [{ pageIndex, pageSize }, setPagination] = useState<PaginationState>({
pageIndex: 0,
pageSize: rowsPerPage,
});

const [lastIndex, setLastIndex] = useState(pageSize);
const [startIndex, setStartIndex] = useState(0);
const [hasSorted, setHasSorted] = useState(false);

if (csv) {
const out = parseCsv(csv);
@@ -91,56 +62,21 @@ export const Table = ({
);
}, [data, cols]);

let table: ReactTable<unknown>;
const [globalFilter, setGlobalFilter] = useState('');

if (datastoreConfig) {
useEffect(() => {
setIsLoading(true);
fetch(`${dataStoreURI}&limit=${rowsPerPage}&offset=0`)
.then((res) => res.json())
.then(async (res) => {
const { data, cols, total } = await dataMapperFn(res);
setData(data);
setCols(cols);
setTotalOfRows(Math.ceil(total / rowsPerPage));
pageMap.set(0, true);
})
.finally(() => setIsLoading(false));
}, [dataStoreURI]);

table = useReactTable({
data,
pageCount: totalOfRows,
columns: tableCols,
getCoreRowModel: getCoreRowModel(),
state: {
pagination: { pageIndex, pageSize },
},
getFilteredRowModel: getFilteredRowModel(),
manualPagination: true,
onPaginationChange: setPagination,
getSortedRowModel: getSortedRowModel(),
});

useEffect(() => {
if (!hasSorted) return;
queryDataByText(globalFilter);
}, [table.getState().sorting]);
} else {
table = useReactTable({
data,
columns: tableCols,
getCoreRowModel: getCoreRowModel(),
state: {
globalFilter,
},
globalFilterFn: globalFilterFn,
onGlobalFilterChange: setGlobalFilter,
getFilteredRowModel: getFilteredRowModel(),
getPaginationRowModel: getPaginationRowModel(),
getSortedRowModel: getSortedRowModel(),
});
}
const table = useReactTable({
data,
columns: tableCols,
getCoreRowModel: getCoreRowModel(),
state: {
globalFilter,
},
globalFilterFn: globalFilterFn,
onGlobalFilterChange: setGlobalFilter,
getFilteredRowModel: getFilteredRowModel(),
getPaginationRowModel: getPaginationRowModel(),
getSortedRowModel: getSortedRowModel(),
});

useEffect(() => {
if (url) {
@@ -155,70 +91,6 @@ export const Table = ({
}
}, [url]);

const queryDataByText = (filter) => {
setIsLoadingPage(true);
const sortedParam = getSortParam();
fetch(
`${dataStoreURI}&limit=${rowsPerPage}&offset=0&q=${filter}${sortedParam}`
)
.then((res) => res.json())
.then(async (res) => {
const { data, total = 0 } = await dataMapperFn(res);
setTotalOfRows(Math.ceil(total / rowsPerPage));
setData(data);
const newMap = new Map();
newMap.set(0, true);
setPageMap(newMap);
table.setPageIndex(0);
setStartIndex(0);
setLastIndex(pageSize);
})
.finally(() => setIsLoadingPage(false));
};

const getSortParam = () => {
const sort = table.getState().sorting;
return sort.length == 0
? ``
: '&sort=' +
sort
.map(
(x, i) =>
`${x.id}${
i === sort.length - 1 ? (x.desc ? ` desc` : ` asc`) : `,`
}`
)
.reduce((x1, x2) => x1 + x2);
};

const queryPaginatedData = (newPageIndex) => {
let newStartIndex = newPageIndex * pageSize;
setStartIndex(newStartIndex);
setLastIndex(newStartIndex + pageSize);

if (!pageMap.get(newPageIndex)) pageMap.set(newPageIndex, true);
else return;

const sortedParam = getSortParam();

setIsLoadingPage(true);
fetch(
`${dataStoreURI}&limit=${rowsPerPage}&offset=${
newStartIndex + pageSize
}&q=${globalFilter}${sortedParam}`
)
.then((res) => res.json())
.then(async (res) => {
const { data: responseData } = await dataMapperFn(res);
responseData.forEach((e) => {
data[newStartIndex] = e;
newStartIndex++;
});
setData([...data]);
})
.finally(() => setIsLoadingPage(false));
};

return isLoading ? (
<div className="w-full h-full min-h-[500px] flex items-center justify-center">
<LoadingSpinner />
@@ -227,10 +99,7 @@ export const Table = ({
<div className={`${fullWidth ? 'w-[90vw] ml-[calc(50%-45vw)]' : 'w-full'}`}>
<DebouncedInput
value={globalFilter ?? ''}
onChange={(value: any) => {
if (datastoreConfig) queryDataByText(String(value));
setGlobalFilter(String(value));
}}
onChange={(value: any) => setGlobalFilter(String(value))}
className="p-2 text-sm shadow border border-block"
placeholder="Search all columns..."
/>
@@ -245,10 +114,7 @@ export const Table = ({
className: h.column.getCanSort()
? 'cursor-pointer select-none'
: '',
onClick: (v) => {
setHasSorted(true);
h.column.getToggleSortingHandler()(v);
},
onClick: h.column.getToggleSortingHandler(),
}}
>
{flexRender(h.column.columnDef.header, h.getContext())}
@@ -269,28 +135,15 @@ export const Table = ({
))}
</thead>
<tbody>
{datastoreConfig && isLoadingPage ? (
<tr>
<td colSpan={cols.length} rowSpan={cols.length}>
<div className="w-full h-full flex items-center justify-center pt-6">
<LoadingSpinner />
</div>
</td>
{table.getRowModel().rows.map((r) => (
<tr key={r.id} className="border-b border-b-slate-200">
{r.getVisibleCells().map((c) => (
<td key={c.id} className="py-2">
{flexRender(c.column.columnDef.cell, c.getContext())}
</td>
))}
</tr>
) : (
(datastoreConfig
? table.getRowModel().rows.slice(startIndex, lastIndex)
: table.getRowModel().rows
).map((r) => (
<tr key={r.id} className="border-b border-b-slate-200">
{r.getVisibleCells().map((c) => (
<td key={c.id} className="py-2">
{flexRender(c.column.columnDef.cell, c.getContext())}
</td>
))}
</tr>
))
)}
))}
</tbody>
</table>
<div className="flex gap-2 items-center justify-center mt-10">
@@ -298,10 +151,7 @@ export const Table = ({
className={`w-6 h-6 ${
!table.getCanPreviousPage() ? 'opacity-25' : 'opacity-100'
}`}
onClick={() => {
if (datastoreConfig) queryPaginatedData(0);
table.setPageIndex(0);
}}
onClick={() => table.setPageIndex(0)}
disabled={!table.getCanPreviousPage()}
>
<ChevronDoubleLeftIcon />
@@ -310,12 +160,7 @@ export const Table = ({
className={`w-6 h-6 ${
!table.getCanPreviousPage() ? 'opacity-25' : 'opacity-100'
}`}
onClick={() => {
if (datastoreConfig) {
queryPaginatedData(table.getState().pagination.pageIndex - 1);
}
table.previousPage();
}}
onClick={() => table.previousPage()}
disabled={!table.getCanPreviousPage()}
>
<ChevronLeftIcon />
@@ -331,11 +176,7 @@ export const Table = ({
className={`w-6 h-6 ${
!table.getCanNextPage() ? 'opacity-25' : 'opacity-100'
}`}
onClick={() => {
if (datastoreConfig)
queryPaginatedData(table.getState().pagination.pageIndex + 1);
table.nextPage();
}}
onClick={() => table.nextPage()}
disabled={!table.getCanNextPage()}
>
<ChevronRightIcon />
@@ -344,11 +185,7 @@ export const Table = ({
className={`w-6 h-6 ${
!table.getCanNextPage() ? 'opacity-25' : 'opacity-100'
}`}
onClick={() => {
const pageIndexToNavigate = table.getPageCount() - 1;
if (datastoreConfig) queryPaginatedData(pageIndexToNavigate);
|
||||
table.setPageIndex(pageIndexToNavigate);
|
||||
}}
|
||||
onClick={() => table.setPageIndex(table.getPageCount() - 1)}
|
||||
disabled={!table.getCanNextPage()}
|
||||
>
|
||||
<ChevronDoubleRightIcon />
|
||||
|
||||
@@ -1,7 +1,6 @@
// Wrapper for the Vega component
import { Vega as VegaOg } from "react-vega";
import { VegaProps } from "react-vega/lib/Vega";

export function Vega(props: VegaProps) {
export function Vega(props) {
return <VegaOg {...props} />;
}

@@ -1,9 +1,8 @@
// Wrapper for the Vega Lite component
import { VegaLite as VegaLiteOg } from 'react-vega';
import { VegaLiteProps } from 'react-vega/lib/VegaLite';
import applyFullWidthDirective from '../lib/applyFullWidthDirective';
import { VegaLite as VegaLiteOg } from "react-vega";
import applyFullWidthDirective from "./../lib/applyFullWidthDirective";

export function VegaLite(props: VegaLiteProps) {
export function VegaLite(props) {
const Component = applyFullWidthDirective({ Component: VegaLiteOg });

return <Component {...props} />;

@@ -1,17 +1,10 @@
export * from './components/Table';
export * from './components/Catalog';
export * from './components/LineChart';
export * from './components/Vega';
export * from './components/VegaLite';
export * from './components/FlatUiTable';
export * from './components/OpenLayers/OpenLayers';
export * from './components/Map';
export * from './components/PdfViewer';
export * from "./components/Excel";
export * from "./components/Iframe";
export * from "./components/Plotly";
export * from "./components/PlotlyLineChart";
export * from "./components/PlotlyBarChart";
// NOTE: components that are hidden for now
// TODO: deprecate those components?
// export * from './components/Table';
// export * from "./components/BucketViewer";
// export * from './components/OpenLayers/OpenLayers';

File diff suppressed because it is too large
@@ -1,18 +0,0 @@
/*
* All components should use this interface for
* its data property.
* Based on vega.
*
*/

type URL = string; // Just in case we want to transform it into an object with configurations
export interface Data {
url?: URL;
values?: { [key: string]: number | string }[];
csv?: string;
}

export interface GeospatialData {
url?: URL;
geojson?: GeoJSON.GeoJSON;
}
@@ -1,100 +0,0 @@
// NOTE: this component was renamed with .bkp so that it's hidden
// from the Storybook app

import { type Meta, type StoryObj } from '@storybook/react';

import {
BucketViewer,
BucketViewerProps,
} from '../src/components/BucketViewer';
import LoadingSpinner from '../src/components/LoadingSpinner';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/BucketViewer',
component: BucketViewer,
tags: ['autodocs'],
argTypes: {
domain: {
description: 'Bucket domain URI',
},
suffix: {
description: 'Suffix of bucket domain',
},
downloadConfig: {
description: `Bucket file download configuration`,
},
filterState: {
description: `State with values used to filter the bucket files`,
},
paginationConfig: {
description: `Configuration to show and stylise the pagination on the component`,
},
},
};

export default meta;

type Story = StoryObj<BucketViewerProps>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const Normal: Story = {
name: 'Bucket viewer',
args: {
domain: 'https://ssen-smart-meter.datopian.workers.dev',
suffix: '/',
dataMapperFn: async (rawData: Response) => {
const result = await rawData.json();
return result.objects.map((e) => ({
downloadFileUri: e.downloadLink,
fileName: e.key.replace(/^(\w+\/)/g, ''),
dateProps: {
date: new Date(e.uploaded),
dateFormatter: (date) => date.toLocaleDateString(),
},
}));
},
},
};

export const WithPagination: Story = {
name: 'With pagination',
args: {
domain: 'https://ssen-smart-meter.datopian.workers.dev',
suffix: '/',
paginationConfig: {
itemsPerPage: 3,
},
dataMapperFn: async (rawData: Response) => {
const result = await rawData.json();
return result.objects.map((e) => ({
downloadFileUri: e.downloadLink,
fileName: e.key.replace(/^(\w+\/)/g, ''),
dateProps: {
date: new Date(e.uploaded),
dateFormatter: (date) => date.toLocaleDateString(),
},
}));
},
},
};

export const WithComponentOnHoverOfEachBucketFile: Story = {
name: 'With component on hover of each bucket file',
args: {
domain: 'https://ssen-smart-meter.datopian.workers.dev',
suffix: '/',
downloadConfig: { hoverOfTheFileComponent: `HOVER COMPONENT` },
dataMapperFn: async (rawData: Response) => {
const result = await rawData.json();
return result.objects.map((e) => ({
downloadFileUri: e.downloadLink,
fileName: e.key.replace(/^(\w+\/)/g, ''),
dateProps: {
date: new Date(e.uploaded),
dateFormatter: (date) => date.toLocaleDateString(),
},
}));
},
},
};
@@ -10,14 +10,11 @@ const meta: Meta = {
argTypes: {
datasets: {
description:
"Array of items to be displayed on the searchable list. Must have the following properties: \n\n \
`_id`: item's unique id \n\n \
`url_path`: href of the item \n\n \
`metadata`: object with a `title` property, that will be displayed as the title of the item, together with any other custom fields that might or not be faceted.",
'Lists of datasets to be displayed in the list, will usually be automatically available',
},
facets: {
description:
"Array of strings, which are name of properties in the datasets' `metadata`, which are going to be faceted.",
'List of frontmatter fields that should be used as filters, needs to match exactly with the field name',
},
},
};
@@ -34,42 +31,7 @@ export const WithoutFacets: Story = {
{
_id: '07026b22d49916754df1dc8ffb9ccd1c31878aae',
url_path: 'dataset-4',
metadata: {
title: 'Detecting Abusive Albanian',
},
},
{
_id: '42c86cf3c4fbbab11d91c2a7d6dcb8f750bc4e19',
url_path: 'dataset-1',
metadata: {
title: 'AbuseEval v1.0',
},
},
{
_id: '80001dd32a752421fdcc64e91fbd237dc31d6bb3',
url_path: 'dataset-2',
metadata: {
title:
'Abusive Language Detection on Arabic Social Media (Al Jazeera)',
},
},
{
_id: '96649d05d8193f4333b10015af76c6562971bd8c',
url_path: 'dataset-3',
metadata: {
title: 'CoRAL: a Context-aware Croatian Abusive Language Dataset',
},
},
],
},
};
export const WithFacets: Story = {
name: 'Catalog with facets',
args: {
datasets: [
{
_id: '07026b22d49916754df1dc8ffb9ccd1c31878aae',
url_path: 'dataset-4',
file_path: 'content/dataset-4/index.md',
metadata: {
title: 'Detecting Abusive Albanian',
'link-to-publication': 'https://arxiv.org/abs/2107.13592',
@@ -158,6 +120,107 @@ export const WithFacets: Story = {
},
},
],
facets: ['language', 'platform'],
},
};
;

export const WithFacets: Story = {
name: 'Catalog with facets',
args: {
datasets: [
{
_id: '07026b22d49916754df1dc8ffb9ccd1c31878aae',
url_path: 'dataset-4',
file_path: 'content/dataset-4/index.md',
metadata: {
title: 'Detecting Abusive Albanian',
'link-to-publication': 'https://arxiv.org/abs/2107.13592',
'link-to-data': 'https://doi.org/10.6084/m9.figshare.19333298.v1',
'task-description':
'Hierarchical (offensive/not; untargeted/targeted; person/group/other)',
'details-of-task':
'Detect and categorise abusive language in social media data',
'size-of-dataset': 11874,
'percentage-abusive': 13.2,
language: 'Albanian',
'level-of-annotation': ['Posts'],
platform: ['Instagram', 'Youtube'],
medium: ['Text'],
reference:
'Nurce, E., Keci, J., Derczynski, L., 2021. Detecting Abusive Albanian. arXiv:2107.13592',
},
},
{
_id: '42c86cf3c4fbbab11d91c2a7d6dcb8f750bc4e19',
url_path: 'dataset-1',
file_path: 'content/dataset-1/index.md',
metadata: {
title: 'AbuseEval v1.0',
'link-to-publication':
'http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.760.pdf',
'link-to-data': 'https://github.com/tommasoc80/AbuseEval',
'task-description':
'Explicitness annotation of offensive and abusive content',
'details-of-task':
'Enriched versions of the OffensEval/OLID dataset with the distinction of explicit/implicit offensive messages and the new dimension for abusive messages. Labels for offensive language: EXPLICIT, IMPLICT, NOT; Labels for abusive language: EXPLICIT, IMPLICT, NOTABU',
'size-of-dataset': 14100,
'percentage-abusive': 20.75,
language: 'English',
'level-of-annotation': ['Tweets'],
platform: ['Twitter'],
medium: ['Text'],
reference:
'Caselli, T., Basile, V., Jelena, M., Inga, K., and Michael, G. 2020. "I feel offended, don’t be abusive! implicit/explicit messages in offensive and abusive language". The 12th Language Resources and Evaluation Conference (pp. 6193-6202). European Language Resources Association.',
},
},
{
_id: '80001dd32a752421fdcc64e91fbd237dc31d6bb3',
url_path: 'dataset-2',
file_path: 'content/dataset-2/index.md',
metadata: {
title:
'Abusive Language Detection on Arabic Social Media (Al Jazeera)',
'link-to-publication': 'https://www.aclweb.org/anthology/W17-3008',
'link-to-data':
'http://alt.qcri.org/~hmubarak/offensive/AJCommentsClassification-CF.xlsx',
'task-description':
'Ternary (Obscene, Offensive but not obscene, Clean)',
'details-of-task': 'Incivility',
'size-of-dataset': 32000,
'percentage-abusive': 0.81,
language: 'Arabic',
'level-of-annotation': ['Posts'],
platform: ['AlJazeera'],
medium: ['Text'],
reference:
'Mubarak, H., Darwish, K. and Magdy, W., 2017. Abusive Language Detection on Arabic Social Media. In: Proceedings of the First Workshop on Abusive Language Online. Vancouver, Canada: Association for Computational Linguistics, pp.52-56.',
},
},
{
_id: '96649d05d8193f4333b10015af76c6562971bd8c',
url_path: 'dataset-3',
file_path: 'content/dataset-3/index.md',
metadata: {
title: 'CoRAL: a Context-aware Croatian Abusive Language Dataset',
'link-to-publication':
'https://aclanthology.org/2022.findings-aacl.21/',
'link-to-data':
'https://github.com/shekharRavi/CoRAL-dataset-Findings-of-the-ACL-AACL-IJCNLP-2022',
'task-description':
'Multi-class based on context dependency categories (CDC)',
'details-of-task': 'Detectioning CDC from abusive comments',
'size-of-dataset': 2240,
'percentage-abusive': 100,
language: 'Croatian',
'level-of-annotation': ['Posts'],
platform: ['Posts'],
medium: ['Newspaper Comments'],
reference:
'Ravi Shekhar, Mladen Karan and Matthew Purver (2022). CoRAL: a Context-aware Croatian Abusive Language Dataset. Findings of the ACL: AACL-IJCNLP.',
},
},
],
facets: ['language', 'platform']
},
};
;

@@ -4,13 +4,13 @@ import { Excel, ExcelProps } from '../src/components/Excel';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Tabular/Excel',
title: 'Components/Excel',
component: Excel,
tags: ['autodocs'],
argTypes: {
data: {
url: {
description:
'Object with a `url` property pointing to the Excel file to be displayed, e.g.: `{ url: "https://url.to/data.csv" }`',
'Url of the file to be displayed e.g.: "https://url.to/data.csv"',
},
},
};
@@ -22,17 +22,13 @@ type Story = StoryObj<ExcelProps>;
export const SingleSheet: Story = {
name: 'Excel file with just one sheet',
args: {
data: {
url: 'https://sheetjs.com/pres.xlsx',
},
url: 'https://sheetjs.com/pres.xlsx',
},
};

export const MultipleSheet: Story = {
name: 'Excel file with multiple sheets',
args: {
data: {
url: 'https://storage.portaljs.org/IC-Gantt-Chart-Project-Template-8857.xlsx',
},
url: 'https://storage.portaljs.org/IC-Gantt-Chart-Project-Template-8857.xlsx',
},
};

@@ -4,31 +4,29 @@ import { FlatUiTable, FlatUiTableProps } from '../src/components/FlatUiTable';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Tabular/FlatUiTable',
title: 'Components/FlatUiTable',
component: FlatUiTable,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Data to be displayed. \n\n \
Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
`url`: URL pointing to a CSV file. \n\n \
`values`: array of objects. \n\n \
`csv`: string with valid CSV. \n\n \
',
'Data to be displayed in the table, must be setup as an array of key value pairs',
},
csv: {
description: 'CSV data as string.',
},
url: {
description:
'Fetch the data from a CSV file remotely. only the first 5MB of data will be displayed',
},
bytes: {
description:
'Fetch the data from a CSV file remotely. Only the first <bytes> of data will be displayed. Defaults to 5MB.',
'Fetch the data from a CSV file remotely. only the first <bytes> of data will be displayed',
},
parsingConfig: {
description:
'Configuration for parsing the CSV data. See https://www.papaparse.com/docs#config for more details',
},
uniqueId: {
description:
'Provide a unique ID to help with cache revalidation of the fetched data.',
},
},
};

@@ -38,40 +36,34 @@ type Story = StoryObj<FlatUiTableProps>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const FromColumnsAndData: Story = {
name: 'Table from array or objects',
name: 'Table data',
args: {
data: {
values: [
{ id: 1, lastName: 'Snow', firstName: 'Jon', age: 35 },
{ id: 2, lastName: 'Lannister', firstName: 'Cersei', age: 42 },
{ id: 3, lastName: 'Lannister', firstName: 'Jaime', age: 45 },
{ id: 4, lastName: 'Stark', firstName: 'Arya', age: 16 },
{ id: 7, lastName: 'Clifford', firstName: 'Ferrara', age: 44 },
{ id: 8, lastName: 'Frances', firstName: 'Rossini', age: 36 },
{ id: 9, lastName: 'Roxie', firstName: 'Harvey', age: 65 },
],
},
data: [
{ id: 1, lastName: 'Snow', firstName: 'Jon', age: 35 },
{ id: 2, lastName: 'Lannister', firstName: 'Cersei', age: 42 },
{ id: 3, lastName: 'Lannister', firstName: 'Jaime', age: 45 },
{ id: 4, lastName: 'Stark', firstName: 'Arya', age: 16 },
{ id: 7, lastName: 'Clifford', firstName: 'Ferrara', age: 44 },
{ id: 8, lastName: 'Frances', firstName: 'Rossini', age: 36 },
{ id: 9, lastName: 'Roxie', firstName: 'Harvey', age: 65 },
],
},
};

export const FromRawCSV: Story = {
name: 'Table from inline CSV',
name: 'Table from raw CSV',
args: {
data: {
csv: `
rawCsv: `
Year,Temp Anomaly
1850,-0.418
2020,0.923
`,
},
},
};

export const FromURL: Story = {
name: 'Table from URL',
args: {
data: {
url: 'https://storage.openspending.org/alberta-budget/__os_imported__alberta_total.csv',
},
url: 'https://ckan-dev.sse.datopian.com/datastore/dump/601c9cf0-595e-46d8-88fc-d1ab2904e2db',
},
};

@@ -1,33 +0,0 @@
import { type Meta, type StoryObj } from '@storybook/react';

import { Iframe, IframeProps } from '../src/components/Iframe';

const meta: Meta = {
title: 'Components/Embedding/Iframe',
component: Iframe,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Object with a `url` property pointing to the page to be embeded.',
},
style: {
description:
'Style object of the component. See example at https://react.dev/learn#displaying-data. Defaults to `{ width: "100%", height: "100%" }`',
},
},
};

export default meta;

type Story = StoryObj<IframeProps>;

export const Normal: Story = {
name: 'Iframe',
args: {
data: {
url: 'https://app.powerbi.com/view?r=eyJrIjoiYzBmN2Q2MzYtYzE3MS00ODkxLWE5OWMtZTQ2MjBlMDljMDk4IiwidCI6Ijk1M2IwZjgzLTFjZTYtNDVjMy04MmM5LTFkODQ3ZTM3MjMzOSIsImMiOjh9',
},
style: { width: `100%`, height: `600px` },
},
};
@@ -4,6 +4,6 @@ import { Meta } from '@storybook/blocks';

# Welcome to the PortalJS components guide

**Official Website:** [portaljs.com](https://portaljs.com)
**Docs:** [portaljs.com/opensource](https://portaljs.com/opensource)
**Official Website:** [portaljs.org](https://portaljs.org)
**Docs:** [portaljs.org/docs](https://portaljs.org/docs)
**GitHub:** [github.com/datopian/portaljs](https://github.com/datopian/portaljs)
@@ -4,40 +4,37 @@ import { LineChart, LineChartProps } from '../src/components/LineChart';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Charts/LineChart',
title: 'Components/LineChart',
component: LineChart,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Data to be displayed. \n\n \
Must be an object with one of the following properties: `url` or `values` \n\n \
`url`: URL pointing to a CSV file. \n\n \
`values`: array of objects \n\n',
'Data to be displayed.\n\n E.g.: [["1990", 1], ["1991", 2]] \n\nOR\n\n "https://url.to/data.csv"',
},
title: {
description: 'Title to display on the chart.',
description: 'Title to display on the chart. Optional.',
},
xAxis: {
description:
'Name of the column header or object property that represents the X-axis on the data.',
'Name of the X axis on the data. Required when the "data" parameter is an URL.',
},
xAxisType: {
description: 'Type of the X-axis.',
description: 'Type of the X axis',
},
xAxisTimeUnit: {
description: 'Time unit of the X-axis, in case its type is `temporal.`',
description: 'Time unit of the X axis (optional)',
},
yAxis: {
description:
'Name of the column headers or object properties that represent the Y-axis on the data.',
'Name of the Y axis on the data. Required when the "data" parameter is an URL.',
},
yAxisType: {
description: 'Type of the Y-axis',
description: 'Type of the Y axis',
},
symbol: {
fullWidth: {
description:
'Name of the column header or object property that represents a series for multiple series.',
'Whether the component should be rendered as full bleed or not',
},
},
};
@@ -50,72 +47,21 @@ type Story = StoryObj<LineChartProps>;
export const FromDataPoints: Story = {
name: 'Line chart from array of data points',
args: {
data: {
values: [
{ year: '1850', value: -0.41765878 },
{ year: '1851', value: -0.2333498 },
{ year: '1852', value: -0.22939907 },
{ year: '1853', value: -0.27035445 },
{ year: '1854', value: -0.29163003 },
],
},
xAxis: 'year',
yAxis: 'value',
},
};

export const MultiSeries: Story = {
name: 'Line chart with multiple series (specifying symbol)',
args: {
data: {
values: [
{ year: '1850', value: -0.41765878, z: 'A' },
{ year: '1851', value: -0.2333498, z: 'A' },
{ year: '1852', value: -0.22939907, z: 'A' },
{ year: '1853', value: -0.27035445, z: 'A' },
{ year: '1854', value: -0.29163003, z: 'A' },
{ year: '1850', value: -0.42993882, z: 'B' },
{ year: '1851', value: -0.30365549, z: 'B' },
{ year: '1852', value: -0.27905189, z: 'B' },
{ year: '1853', value: -0.22939704, z: 'B' },
{ year: '1854', value: -0.25688013, z: 'B' },
{ year: '1850', value: -0.4757164, z: 'C' },
{ year: '1851', value: -0.41971018, z: 'C' },
{ year: '1852', value: -0.40724799, z: 'C' },
{ year: '1853', value: -0.45049156, z: 'C' },
{ year: '1854', value: -0.41896583, z: 'C' },
],
},
xAxis: 'year',
yAxis: 'value',
symbol: 'z',
},
};

export const MultiColumns: Story = {
name: 'Line chart with multiple series (with multiple columns)',
args: {
data: {
values: [
{ year: '1850', A: -0.41765878, B: -0.42993882, C: -0.4757164 },
{ year: '1851', A: -0.2333498, B: -0.30365549, C: -0.41971018 },
{ year: '1852', A: -0.22939907, B: -0.27905189, C: -0.40724799 },
{ year: '1853', A: -0.27035445, B: -0.22939704, C: -0.45049156 },
{ year: '1854', A: -0.29163003, B: -0.25688013, C: -0.41896583 },
],
},
xAxis: 'year',
yAxis: ['A', 'B', 'C'],
data: [
['1850', -0.41765878],
['1851', -0.2333498],
['1852', -0.22939907],
['1853', -0.27035445],
['1854', -0.29163003],
],
},
};

export const FromURL: Story = {
name: 'Line chart from URL',
args: {
data: {
url: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
},
title: 'Oil Price x Year',
data: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
xAxis: 'Date',
yAxis: 'Price',
},

@@ -4,33 +4,22 @@ import { Map, MapProps } from '../src/components/Map';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Geospatial/Map',
title: 'Components/Map',
component: Map,
tags: ['autodocs'],
argTypes: {
layers: {
description:
'Array of layers to be displayed on the map. Should be an object with: \n\n \
`data`: object with either a `url` property pointing to a GeoJSON file or a `geojson` property with a GeoJSON object. \n\n \
`name`: name of the layer. \n\n \
`colorscale`: object with a `starting` and `ending` colors that will be used to create a gradient and color the map. \n\n \
`tooltip`: `true` to show all available features on the tooltip, object with a `propNames` property as an array of strings to choose which features to display. \n\n',
'Data to be displayed.\n\n GeoJSON Object \n\nOR\n\n URL to GeoJSON Object',
},
title: {
description: 'Title to display on the map.',
description: 'Title to display on the map. Optional.',
},
center: {
description: 'Initial coordinates of the center of the map',
},
zoom: {
description: 'Initial zoom level',
},
style: {
description: "CSS styles to be applied to the map's container.",
},
autoZoomConfiguration: {
description:
"Pass a layer's name to automatically zoom to the bounding area of a layer.",
description: 'Zoom level',
},
},
};
@@ -43,15 +32,9 @@ type Story = StoryObj<MapProps>;
export const GeoJSONPolygons: Story = {
name: 'GeoJSON polygons map',
args: {
tileLayerName:'MapBox',
tileLayerOptions:{
accessToken : 'pk.eyJ1Ijoid2lsbHktcGFsbWFyZWpvIiwiYSI6ImNqNzk5NmRpNDFzb2cyeG9sc2luMHNjajUifQ.lkoVRFSI8hOLH4uJeOzwXw',
},
layers: [
{
data: {
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
},
data: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
name: 'Polygons',
tooltip: { propNames: ['name'] },
colorScale: {
@@ -71,9 +54,7 @@ export const GeoJSONPoints: Story = {
args: {
layers: [
{
data: {
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
},
data: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
name: 'Points',
tooltip: { propNames: ['Location'] },
},
@@ -89,16 +70,12 @@ export const GeoJSONMultipleLayers: Story = {
args: {
layers: [
{
data: {
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
},
data: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
name: 'Points',
tooltip: true,
},
{
data: {
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
},
data: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
name: 'Polygons',
tooltip: true,
colorScale: {
@@ -112,35 +89,3 @@ export const GeoJSONMultipleLayers: Story = {
zoom: 2,
},
};

export const GeoJSONMultipleLayersWithAutoZoomInSpecifiedLayer: Story = {
name: 'GeoJSON polygons and points map with auto zoom in the points layer',
args: {
layers: [
{
data: {
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
},
name: 'Points',
tooltip: true,
},
{
data: {
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
},
name: 'Polygons',
tooltip: true,
colorScale: {
starting: '#ff0000',
ending: '#00ff00',
},
},
],
title: 'Polygons and points',
center: { latitude: 45, longitude: 0 },
zoom: 2,
autoZoomConfiguration: {
layerName: 'Points',
},
},
};

@@ -1,6 +1,3 @@
// NOTE: this component was renamed with .bkp so that it's hidden
// from the Storybook app

import type { Meta, StoryObj } from '@storybook/react';
import React from 'react';
import OpenLayers from '../src/components/OpenLayers/OpenLayers';
@@ -3,21 +3,19 @@ import type { Meta, StoryObj } from '@storybook/react';
import { PdfViewer, PdfViewerProps } from '../src/components/PdfViewer';

const meta: Meta = {
title: 'Components/Embedding/PdfViewer',
title: 'Components/PdfViewer',
component: PdfViewer,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Object with a `url` property pointing to the PDF file to be displayed, e.g.: `{ url: "https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK" }`.',
url: {
description: 'URL to PDF file',
},
parentClassName: {
description:
'HTML classes to be applied to the container of the PDF viewer. [Tailwind](https://tailwindcss.com/) classes, such as `h-96` to define the height of the component, can be used on this field.',
description: 'Classname for the parent div of the pdf viewer',
},
layout: {
layour: {
description:
'Set to `true` if you want to display a layout with zoom level, page count, printing button and other controls.',
'Set to true if you want to have a layout with zoom level, page count, printing button etc',
defaultValue: false,
},
},
@@ -27,23 +25,26 @@ export default meta;

type Story = StoryObj<PdfViewerProps>;

export const PdfViewerStoryWithoutControlsLayout: Story = {
name: 'PDF Viewer without controls layout',
export const PdfViewerStory: Story = {
name: 'PdfViewer',
args: {
data: {
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
},
parentClassName: 'h-96',
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
},
};

export const PdfViewerStoryWithControlsLayout: Story = {
name: 'PdfViewer with controls layout',
export const PdfViewerStoryWithLayout: Story = {
name: 'PdfViewer with the default layout',
args: {
data: {
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
},
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
layout: true,
},
};

export const PdfViewerStoryWithHeight: Story = {
name: 'PdfViewer with a custom height',
args: {
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
parentClassName: 'h-96',
layout: true,
parentClassName: 'h-96',
},
};

@@ -1,49 +0,0 @@
import type { Meta, StoryObj } from '@storybook/react';

import { Plotly } from '../src/components/Plotly';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
  title: 'Components/Charts/Plotly',
  component: Plotly,
  tags: ['autodocs'],
  argTypes: {
    data: {
      description:
        "Plotly's `data` prop. You can find references on how to use these props at https://github.com/plotly/react-plotly.js/#basic-props.",
    },
    layout: {
      description:
        "Plotly's `layout` prop. You can find references on how to use these props at https://github.com/plotly/react-plotly.js/#basic-props.",
    },
  },
};

export default meta;

type Story = StoryObj<any>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const Primary: Story = {
  name: 'Line chart',
  args: {
    data: [
      {
        x: [1, 2, 3],
        y: [2, 6, 3],
        type: 'scatter',
        mode: 'lines+markers',
        marker: { color: 'red' },
      },
    ],
    layout: {
      title: 'Chart built with Plotly',
      xaxis: {
        title: 'x Axis',
      },
      yaxis: {
        title: 'y Axis',
      },
    },
  },
};
@@ -1,102 +0,0 @@
import type { Meta, StoryObj } from '@storybook/react';

import {
  PlotlyBarChart,
  PlotlyBarChartProps,
} from '../src/components/PlotlyBarChart';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
  title: 'Components/Charts/PlotlyBarChart',
  component: PlotlyBarChart,
  tags: ['autodocs'],
  argTypes: {
    data: {
      description:
        'Data to be displayed. \n\n \
        Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
        `url`: URL pointing to a CSV file. \n\n \
        `values`: array of objects (check out [this example](/?path=/story/components-plotlybarchart--from-data-points)) \n\n \
        `csv`: string with valid CSV (check out [this example](/?path=/story/components-plotlybarchart--from-inline-csv)) \n\n \
        ',
    },
    bytes: {
      // TODO: likely this should be an extra option on the data parameter,
      // specific to URLs
      description:
        "How many bytes to read from the url so that the entire file doesn's have to be fetched.",
    },
    parsingConfig: {
      description:
        'If using URL or CSV, this parsing config will be used to parse the data. Check https://www.papaparse.com/ for more info.',
    },
    title: {
      description: 'Title to display on the chart.',
    },
    // TODO: commented out because this doesn't work
    // lineLabel: {
    //   description:
    //     'Label to display on the line, Optional, will use yAxis if not provided',
    // },
    xAxis: {
      description:
        'Name of the column header or object property that represents the X-axis on the data.',
    },
    yAxis: {
      description:
        'Name of the column header or object property that represents the Y-axis on the data.',
    },
    uniqueId: {
      description: 'Provide a unique ID to help with cache revalidation of the fetched data.'
    }
  },
};

export default meta;

type Story = StoryObj<PlotlyBarChartProps>;

export const FromDataPoints: Story = {
  name: 'Bar chart from array of data points',
  args: {
    data: {
      values: [
        { year: '1850', temperature: -0.41765878 },
        { year: '1851', temperature: -0.2333498 },
        { year: '1852', temperature: -0.22939907 },
        { year: '1853', temperature: -0.27035445 },
        { year: '1854', temperature: -0.29163003 },
      ],
    },
    xAxis: 'year',
    yAxis: 'temperature',
  },
};

export const FromURL: Story = {
  name: 'Bar chart from URL',
  args: {
    title: 'Apple Stock Prices',
    data: {
      url: 'https://raw.githubusercontent.com/plotly/datasets/master/finance-charts-apple.csv',
    },
    xAxis: 'Date',
    yAxis: 'AAPL.Open',
  },
};

export const FromInlineCSV: Story = {
  name: 'Bar chart from inline CSV',
  args: {
    title: 'Apple Stock Prices',
    data: {
      csv: `Date,AAPL.Open,AAPL.High,AAPL.Low,AAPL.Close,AAPL.Volume,AAPL.Adjusted,dn,mavg,up,direction
2015-02-17,127.489998,128.880005,126.919998,127.830002,63152400,122.905254,106.7410523,117.9276669,129.1142814,Increasing
2015-02-18,127.629997,128.779999,127.449997,128.720001,44891700,123.760965,107.842423,118.9403335,130.0382439,Increasing
2015-02-19,128.479996,129.029999,128.330002,128.449997,37362400,123.501363,108.8942449,119.8891668,130.8840887,Decreasing
2015-02-20,128.619995,129.5,128.050003,129.5,48948400,124.510914,109.7854494,120.7635001,131.7415509,Increasing`,
    },
    xAxis: 'Date',
    yAxis: 'AAPL.Open',
  },
};
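The `data` arg documented in the hunk above accepts one of `url`, `values`, or `csv`. A minimal sketch of how such a union might be normalized (a hypothetical helper, not part of the PortalJS API; the real component parses CSV with Papa Parse and fetches `url` data asynchronously, optionally limited by the `bytes` arg):

```typescript
// Hypothetical normalizer illustrating the documented `data` union.
// The naive line/comma split stands in for Papa Parse; { url } data
// is left to an async fetch (which could send a Range header to
// honor the `bytes` arg).
type ChartData =
  | { url: string }
  | { values: Record<string, string | number>[] }
  | { csv: string };

function parseCsvNaively(csv: string): Record<string, string>[] {
  const [header, ...rows] = csv.trim().split('\n');
  const cols = header.split(',');
  return rows.map((row) => {
    const cells = row.split(',');
    return Object.fromEntries(cols.map((c, i) => [c, cells[i]]));
  });
}

function resolveChartData(data: ChartData): Record<string, string | number>[] {
  if ('values' in data) return data.values; // pass through as-is
  if ('csv' in data) return parseCsvNaively(data.csv);
  throw new Error('{ url } data must be fetched asynchronously');
}
```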
@@ -1,101 +0,0 @@
import type { Meta, StoryObj } from '@storybook/react';

import {
  PlotlyLineChart,
  PlotlyLineChartProps,
} from '../src/components/PlotlyLineChart';

const meta: Meta = {
  title: 'Components/Charts/PlotlyLineChart',
  component: PlotlyLineChart,
  tags: ['autodocs'],
  argTypes: {
    data: {
      description:
        'Data to be displayed. \n\n \
        Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
        `url`: URL pointing to a CSV file. \n\n \
        `values`: array of objects. \n\n \
        `csv`: string with valid CSV. \n\n \
        ',
    },
    bytes: {
      // TODO: likely this should be an extra option on the data parameter,
      // specific to URLs
      description:
        "How many bytes to read from the url so that the entire file doesn's have to be fetched.",
    },
    parsingConfig: {
      description:
        'If using URL or CSV, this parsing config will be used to parse the data. Check https://www.papaparse.com/ for more info',
    },
    title: {
      description: 'Title to display on the chart.',
    },
    lineLabel: {
      description:
        'Label to display on the line, will use yAxis if not provided',
    },
    xAxis: {
      description:
        'Name of the column header or object property that represents the X-axis on the data.',
    },
    yAxis: {
      description:
        'Name of the column header or object property that represents the Y-axis on the data.',
    },
    uniqueId: {
      description:
        'Provide a unique ID to help with cache revalidation of the fetched data.',
    },
  },
};

export default meta;

type Story = StoryObj<PlotlyLineChartProps>;

export const FromDataPoints: Story = {
  name: 'Line chart from array of data points',
  args: {
    data: {
      values: [
        { year: '1850', temperature: -0.41765878 },
        { year: '1851', temperature: -0.2333498 },
        { year: '1852', temperature: -0.22939907 },
        { year: '1853', temperature: -0.27035445 },
        { year: '1854', temperature: -0.29163003 },
      ],
    },
    xAxis: 'year',
    yAxis: 'temperature',
  },
};

export const FromURL: Story = {
  name: 'Line chart from URL',
  args: {
    title: 'Oil Price x Year',
    data: {
      url: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
    },
    xAxis: 'Date',
    yAxis: 'Price',
  },
};

export const FromInlineCSV: Story = {
  name: 'Bar chart from inline CSV',
  args: {
    title: 'Apple Stock Prices',
    data: {
      csv: `Date,AAPL.Open,AAPL.High,AAPL.Low,AAPL.Close,AAPL.Volume,AAPL.Adjusted,dn,mavg,up,direction
2015-02-17,127.489998,128.880005,126.919998,127.830002,63152400,122.905254,106.7410523,117.9276669,129.1142814,Increasing
2015-02-18,127.629997,128.779999,127.449997,128.720001,44891700,123.760965,107.842423,118.9403335,130.0382439,Increasing
2015-02-19,128.479996,129.029999,128.330002,128.449997,37362400,123.501363,108.8942449,119.8891668,130.8840887,Decreasing
2015-02-20,128.619995,129.5,128.050003,129.5,48948400,124.510914,109.7854494,120.7635001,131.7415509,Increasing`,
    },
    xAxis: 'Date',
    yAxis: 'AAPL.Open',
  },
};
@@ -1,33 +1,25 @@
// NOTE: this component was renamed with .bkp so that it's hidden
// from the Storybook app

import type { Meta, StoryObj } from '@storybook/react';

import { Table, TableProps } from '../src/components/Table';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
  title: 'Components/Tabular/Table',
  title: 'Components/Table',
  component: Table,
  tags: ['autodocs'],
  argTypes: {
    data: {
      description:
        'Data to be displayed in the table, must also set "cols" to work.',
      description: "Data to be displayed in the table, must also set \"cols\" to work."
    },
    cols: {
      description:
        'Columns to be displayed in the table, must also set "data" to work.',
      description: "Columns to be displayed in the table, must also set \"data\" to work."
    },
    csv: {
      description: 'CSV data as string.',
      description: "CSV data as string.",
    },
    url: {
      description: 'Fetch the data from a CSV file remotely.',
    },
    datastoreConfig: {
      description: `Configuration to use CKAN's datastore API extension integrated with the component`,
    },
      description: "Fetch the data from a CSV file remotely."
    }
  },
};

@@ -37,7 +29,7 @@ type Story = StoryObj<TableProps>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const FromColumnsAndData: Story = {
  name: 'Table from columns and data',
  name: "Table from columns and data",
  args: {
    data: [
      { id: 1, lastName: 'Snow', firstName: 'Jon', age: 35 },

@@ -57,40 +49,21 @@ export const FromColumnsAndData: Story = {
  },
};

export const WithDataStoreIntegration: Story = {
  name: 'Table with datastore integration',
  args: {
    datastoreConfig: {
      dataStoreURI: `https://www.civicdata.com/api/action/datastore_search?resource_id=46ec0807-31ff-497f-bfa0-f31c796cdee8`,
      dataMapperFn: ({
        result,
      }: {
        result: { fields: { id }[]; records: []; total: number };
      }) => {
        return {
          data: result.records,
          cols: result.fields.map((x) => ({ key: x.id, name: x.id })),
          total: result.total,
        };
      },
    },
  },
};

export const FromRawCSV: Story = {
  name: 'Table from raw CSV',
  name: "Table from raw CSV",
  args: {
    csv: `
Year,Temp Anomaly
1850,-0.418
2020,0.923
`,
  },
`
}
};

export const FromURL: Story = {
  name: 'Table from URL',
  name: "Table from URL",
  args: {
    url: 'https://raw.githubusercontent.com/datasets/finance-vix/main/data/vix-daily.csv',
  },
  url: "https://raw.githubusercontent.com/datasets/finance-vix/main/data/vix-daily.csv"
}
};
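The `dataMapperFn` in the datastore story above reshapes CKAN's `datastore_search` response into the `{ data, cols, total }` shape the Table component consumes. Extracted as a standalone function it looks like this:

```typescript
// Standalone version of the dataMapperFn from the datastore story:
// adapts a CKAN datastore_search result to the Table contract.
interface DatastoreResult {
  fields: { id: string }[];
  records: Record<string, unknown>[];
  total: number;
}

const dataMapperFn = ({ result }: { result: DatastoreResult }) => {
  return {
    data: result.records,
    // Each CKAN field id becomes both the column key and its display name.
    cols: result.fields.map((x) => ({ key: x.id, name: x.id })),
    total: result.total,
  };
};
```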
@@ -4,19 +4,9 @@ import { Vega } from '../src/components/Vega';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
  title: 'Components/Charts/Vega',
  title: 'Components/Vega',
  component: Vega,
  tags: ['autodocs'],
  argTypes: {
    data: {
      description:
        "Vega's `data` prop. You can find references on how to use this prop at https://vega.github.io/vega/docs/data/",
    },
    spec: {
      description:
        "Vega's `spec` prop. You can find references on how to use this prop at https://vega.github.io/vega/docs/specification/",
    },
  },
};

export default meta;

@@ -25,7 +15,7 @@ type Story = StoryObj<any>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const Primary: Story = {
  name: 'Bar chart',
  name: 'Chart built with Vega',
  args: {
    data: {
      table: [
@@ -4,7 +4,7 @@ import { VegaLite } from '../src/components/VegaLite';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
  title: 'Components/Charts/VegaLite',
  title: 'Components/VegaLite',
  component: VegaLite,
  tags: ['autodocs'],
  argTypes: {

@@ -25,7 +25,7 @@ type Story = StoryObj<any>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const Primary: Story = {
  name: 'Bar chart',
  name: 'Chart built with Vega Lite',
  args: {
    data: {
      table: [
@@ -1,6 +1,6 @@
{
  "name": "@portaljs/core",
  "version": "1.0.9",
  "version": "1.0.8",
  "description": "Core Portal.JS components, configs and utils.",
  "repository": {
    "type": "git",
@@ -53,7 +53,7 @@ export const Nav: React.FC<Props> = ({
    <nav className="flex justify-between">
      {/* Mobile navigation */}
      <div className="mr-2 sm:mr-4 flex lg:hidden">
        <NavMobile {...{title, links, social, search, defaultTheme, themeToggleIcon}}>{children}</NavMobile>
        <NavMobile links={links}>{children}</NavMobile>
      </div>
      {/* Non-mobile navigation */}
      <div className="flex flex-none items-center">
@@ -4,16 +4,20 @@ import { useRouter } from "next/router.js";
import { useEffect, useState } from "react";
import { SearchContext, SearchField } from "../Search";
import { MenuIcon, CloseIcon } from "../Icons";
import type { NavConfig, ThemeConfig } from "./Nav";
import { NavLink, SearchProviderConfig } from "../types";

interface Props extends NavConfig, ThemeConfig, React.PropsWithChildren {}
interface Props extends React.PropsWithChildren {
  author?: string;
  links?: Array<NavLink>;
  search?: SearchProviderConfig;
}

// TODO: Search doesn't appear
// TODO why mobile navigation only accepts author and regular nav accepts different things like title, logo, version
export const NavMobile: React.FC<Props> = ({
  children,
  title,
  links,
  search,
  author,
}) => {
  const router = useRouter();
  const [isOpen, setIsOpen] = useState(false);

@@ -73,8 +77,8 @@ export const NavMobile: React.FC<Props> = ({
            legacyBehavior
          >
            {/* <Logomark className="h-9 w-9" /> */}
            <div className="font-extrabold text-primary dark:text-primary-dark text-lg ml-6">
              {title}
            <div className="font-extrabold text-primary dark:text-primary-dark text-2xl ml-6">
              {author}
            </div>
          </Link>
        </div>

@@ -102,7 +106,9 @@ export const NavMobile: React.FC<Props> = ({
            ))}
          </ul>
        )}
        <div className="pt-6">{children}</div>
        {/* <div className="pt-6 border border-t-2">
          {children}
        </div> */}
      </Dialog.Panel>
    </Dialog>
  </>
@@ -46,8 +46,8 @@ export const SiteToc: React.FC<Props> = ({ currentPath, nav }) => {

  return (
    <nav data-testid="lhs-sidebar" className="flex flex-col space-y-3 text-sm">
      {sortNavGroupChildren(nav).map((n, index) => (
        <NavComponent key={index} item={n} isActive={false} />
      {sortNavGroupChildren(nav).map((n) => (
        <NavComponent item={n} isActive={false} />
      ))}
    </nav>
  );

@@ -96,8 +96,8 @@ const NavComponent: React.FC<{
        leaveTo="transform scale-95 opacity-0"
      >
        <Disclosure.Panel className="flex flex-col space-y-3 pl-5 mt-3">
          {sortNavGroupChildren(item.children).map((subItem, index) => (
            <NavComponent key={index} item={subItem} isActive={false} />
          {sortNavGroupChildren(item.children).map((subItem) => (
            <NavComponent item={subItem} isActive={false} />
          ))}
        </Disclosure.Panel>
      </Transition>
@@ -1,36 +0,0 @@
import Script from 'next/script.js'

export interface GoogleAnalyticsProps {
  googleAnalyticsId: string
}

export const GA = ({ googleAnalyticsId }: GoogleAnalyticsProps) => {
  return (
    <>
      <Script
        strategy="afterInteractive"
        src={`https://www.googletagmanager.com/gtag/js?id=${googleAnalyticsId}`}
      />

      <Script strategy="afterInteractive" id="ga-script">
        {`
          window.dataLayer = window.dataLayer || [];
          function gtag(){dataLayer.push(arguments);}
          gtag('js', new Date());
          gtag('config', '${googleAnalyticsId}');
        `}
      </Script>
    </>
  )
}

// https://developers.google.com/analytics/devguides/collection/gtagjs/events
export const logEvent = (action, category, label, value) => {
  // eslint-disable-next-line @typescript-eslint/ban-ts-comment
  // @ts-ignore
  window.gtag?.('event', action, {
    event_category: category,
    event_label: label,
    value: value,
  })
}
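The `logEvent` helper in the deleted GoogleAnalytics file guards the call with optional chaining, so it silently no-ops until the GA script has defined `window.gtag`. A sketch with a stubbed `gtag` (a plain object stands in for the browser `window`) shows the payload it forwards:

```typescript
// Sketch only: `fakeWindow` stands in for `window`; in the real helper
// the GA <Script> tag defines window.gtag once it loads.
type GtagFn = (...args: unknown[]) => void;

const fakeWindow: { gtag?: GtagFn } = {};
const calls: unknown[][] = [];

const logEvent = (action: string, category: string, label: string, value: number) => {
  // Optional chaining: a no-op while the analytics script has not loaded.
  fakeWindow.gtag?.('event', action, {
    event_category: category,
    event_label: label,
    value: value,
  });
};

logEvent('download', 'dataset', 'vix-daily.csv', 1); // dropped: gtag undefined
fakeWindow.gtag = (...args) => { calls.push(args); };
logEvent('download', 'dataset', 'vix-daily.csv', 1); // forwarded
```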
@@ -1,41 +0,0 @@
import Script from 'next/script.js'

export interface PlausibleProps {
  plausibleDataDomain: string
  dataApi?: string
  src?: string
}

/**
 * Plausible analytics component.
 * To proxy the requests through your own domain, you can use the dataApi and src attribute.
 * See [Plausible docs](https://plausible.io/docs/proxy/guides/nextjs#step-2-adjust-your-deployed-script)
 * for more information.
 *
 */
export const Plausible = ({
  plausibleDataDomain,
  dataApi = undefined,
  src = 'https://plausible.io/js/plausible.js',
}: PlausibleProps) => {
  return (
    <>
      <Script
        strategy="lazyOnload"
        data-domain={plausibleDataDomain}
        data-api={dataApi}
        src={src}
      />
      <Script strategy="lazyOnload" id="plausible-script">
        {`
          window.plausible = window.plausible || function() { (window.plausible.q = window.plausible.q || []).push(arguments) }
        `}
      </Script>
    </>
  )
}

// https://plausible.io/docs/custom-event-goals
export const logEvent = (eventName, ...rest) => {
  return window.plausible?.(eventName, ...rest)
}
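The inline snippet registered by the Plausible component installs a stub that queues events fired before `plausible.js` finishes loading, so early `logEvent` calls are not lost. The same queueing pattern, modeled on a plain object instead of `window`:

```typescript
// Sketch of the queueing stub from the inline plausible-script above,
// applied to a plain object instead of the real `window`.
type PlausibleFn = ((...args: unknown[]) => void) & { q?: unknown[][] };

const w: { plausible?: PlausibleFn } = {};
w.plausible =
  w.plausible ||
  function (...args: unknown[]) {
    // Events fired before plausible.js loads are pushed onto `.q`,
    // where the real script drains them once it initializes.
    (w.plausible!.q = w.plausible!.q || []).push(args);
  };

w.plausible('Signup');
w.plausible('Download', { props: { file: 'vix-daily.csv' } });
```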
@@ -1,25 +0,0 @@
import Script from 'next/script.js'

export interface PosthogProps {
  posthogProjectApiKey: string
  apiHost?: string
}

/**
 * Posthog analytics component.
 * See [Posthog docs](https://posthog.com/docs/libraries/js#option-1-add-javascript-snippet-to-your-html-badgerecommendedbadge) for more information.
 *
 */
export const Posthog = ({
  posthogProjectApiKey,
  apiHost = 'https://app.posthog.com',
}: PosthogProps) => {
  return (
    <Script strategy="lazyOnload" id="posthog-script">
      {`
        !function(t,e){var o,n,p,r;e.__SV||(window.posthog=e,e._i=[],e.init=function(i,s,a){function g(t,e){var o=e.split(".");2==o.length&&(t=t[o[0]],e=o[1]),t[e]=function(){t.push([e].concat(Array.prototype.slice.call(arguments,0)))}}(p=t.createElement("script")).type="text/javascript",p.async=!0,p.src=s.api_host+"/static/array.js",(r=t.getElementsByTagName("script")[0]).parentNode.insertBefore(p,r);var u=e;for(void 0!==a?u=e[a]=[]:a="posthog",u.people=u.people||[],u.toString=function(t){var e="posthog";return"posthog"!==a&&(e+="."+a),t||(e+=" (stub)"),e},u.people.toString=function(){return u.toString(1)+".people (stub)"},o="capture identify alias people.set people.set_once set_config register register_once unregister opt_out_capturing has_opted_out_capturing opt_in_capturing reset isFeatureEnabled onFeatureFlags".split(" "),n=0;n<o.length;n++)g(u,o[n]);e._i.push([i,s,a])},e.__SV=1)}(document,window.posthog||[]);
        posthog.init('${posthogProjectApiKey}',{api_host:'${apiHost}'})
      `}
    </Script>
  )
}
@@ -1,29 +0,0 @@
import Script from 'next/script.js'

export interface SimpleAnalyticsProps {
  src?: string
}

export const SimpleAnalytics = ({
  src = 'https://scripts.simpleanalyticscdn.com/latest.js',
}: SimpleAnalyticsProps) => {
  return (
    <>
      <Script strategy="lazyOnload" id="sa-script">
        {`
          window.sa_event=window.sa_event||function(){var a=[].slice.call(arguments);window.sa_event.q?window.sa_event.q.push(a):window.sa_event.q=[a]};
        `}
      </Script>
      <Script strategy="lazyOnload" src={src} />
    </>
  )
}

// https://docs.simpleanalytics.com/events
export const logEvent = (eventName, callback) => {
  if (callback) {
    return window.sa_event?.(eventName, callback)
  } else {
    return window.sa_event?.(eventName)
  }
}
@@ -1,20 +0,0 @@
import Script from 'next/script.js'

export interface UmamiProps {
  umamiWebsiteId: string
  src?: string
}

export const Umami = ({
  umamiWebsiteId,
  src = 'https://analytics.umami.is/script.js',
}: UmamiProps) => {
  return (
    <Script
      async
      defer
      data-website-id={umamiWebsiteId}
      src={src} // Replace with your umami instance
    />
  )
}
@@ -1,82 +0,0 @@
/* eslint-disable @typescript-eslint/no-explicit-any */
import { GA, GoogleAnalyticsProps } from "./GoogleAnalytics";
import { Plausible, PlausibleProps } from "./Plausible";
import { SimpleAnalytics, SimpleAnalyticsProps } from "./SimpleAnalytics";
import { Umami, UmamiProps } from "./Umami";
import { Posthog, PosthogProps } from "./Posthog";

declare global {
  interface Window {
    // eslint-disable-next-line @typescript-eslint/ban-ts-comment
    // @ts-ignore
    gtag?: (...args: any[]) => void;
    plausible?: (...args: any[]) => void;
    sa_event?: (...args: any[]) => void;
  }
}

export interface AnalyticsConfig {
  googleAnalytics?: GoogleAnalyticsProps;
  plausibleAnalytics?: PlausibleProps;
  umamiAnalytics?: UmamiProps;
  posthogAnalytics?: PosthogProps;
  simpleAnalytics?: SimpleAnalyticsProps;
}

/**
 * @example
 * const analytics: AnalyticsConfig = {
 *   plausibleDataDomain: '', // e.g. tailwind-nextjs-starter-blog.vercel.app
 *   simpleAnalytics: false, // true or false
 *   umamiWebsiteId: '', // e.g. 123e4567-e89b-12d3-a456-426614174000
 *   posthogProjectApiKey: '', // e.g. AhnJK8392ndPOav87as450xd
 *   googleAnalyticsId: '', // e.g. UA-000000-2 or G-XXXXXXX
 * }
 */
export interface AnalyticsProps {
  analyticsConfig: AnalyticsConfig;
}

const isProduction = true || process.env["NODE_ENV"] === "production";

/**
 * Supports Plausible, Simple Analytics, Umami, Posthog or Google Analytics.
 * All components default to the hosted service, but can be configured to use a self-hosted
 * or proxied version of the script by providing the `src` / `apiHost` props.
 *
 * Note: If you want to use an analytics provider you have to add it to the
 * content security policy in the `next.config.js` file.
 * @param {AnalyticsProps} { analytics }
 * @return {*}
 */
export const Analytics = ({ analyticsConfig }: AnalyticsProps) => {
  return (
    <>
      {isProduction && analyticsConfig.plausibleAnalytics && (
        <Plausible {...analyticsConfig.plausibleAnalytics} />
      )}
      {isProduction && analyticsConfig.simpleAnalytics && (
        <SimpleAnalytics {...analyticsConfig.simpleAnalytics} />
      )}
      {isProduction && analyticsConfig.posthogAnalytics && (
        <Posthog {...analyticsConfig.posthogAnalytics} />
      )}
      {isProduction && analyticsConfig.umamiAnalytics && (
        <Umami {...analyticsConfig.umamiAnalytics} />
      )}
      {isProduction && analyticsConfig.googleAnalytics && (
        <GA {...analyticsConfig.googleAnalytics} />
      )}
    </>
  );
};

export { GA, Plausible, SimpleAnalytics, Umami, Posthog };

export type {
  GoogleAnalyticsProps,
  PlausibleProps,
  UmamiProps,
  PosthogProps,
  SimpleAnalyticsProps,
};
@@ -21,4 +21,3 @@ export { SiteToc, NavItem, NavGroup } from "./SiteToc";
export { Comments, CommentsConfig } from "./Comments";
export { AuthorConfig } from "./types";
export { Hero } from "./Hero";
export { Analytics, AnalyticsConfig } from "./analytics";
@@ -7,8 +7,6 @@ export const pageview = ({
  analyticsID: string;
}) => {
  if (typeof window.gtag !== undefined) {
    // eslint-disable-next-line @typescript-eslint/ban-ts-comment
    // @ts-ignore
    window.gtag("config", analyticsID, {
      page_path: url,
    });

@@ -18,8 +16,6 @@ export const pageview = ({
// https://developers.google.com/analytics/devguides/collection/gtagjs/events
export const event = ({ action, category, label, value }) => {
  if (typeof window.gtag !== undefined) {
    // eslint-disable-next-line @typescript-eslint/ban-ts-comment
    // @ts-ignore
    window.gtag("event", action, {
      event_category: category,
      event_label: label,
@@ -1,17 +1,5 @@
# @portaljs/remark-wiki-link

## 1.2.0

### Minor Changes

- [#1084](https://github.com/datopian/datahub/pull/1084) [`57952e08`](https://github.com/datopian/datahub/commit/57952e0817770138881e7492dc9f43e9910b56a8) Thanks [@mohamedsalem401](https://github.com/mohamedsalem401)! - Add image resize feature

## 1.1.2

### Patch Changes

- [#1040](https://github.com/datopian/portaljs/pull/1040) [`85bb6cb9`](https://github.com/datopian/portaljs/commit/85bb6cb98c53bedc2add3d014927570b5dd1bbdf) Thanks [@Gutts-n](https://github.com/Gutts-n)! - Changed regex to permit any symbols other than #

## 1.1.1

### Patch Changes
@@ -1,6 +1,6 @@
{
  "name": "@portaljs/remark-wiki-link",
  "version": "1.2.0",
  "version": "1.1.1",
  "description": "Parse and render wiki-style links in markdown especially Obsidian style links.",
  "repository": {
    "type": "git",
@@ -1,23 +1,23 @@
|
||||
import { isSupportedFileFormat } from './isSupportedFileFormat';
|
||||
import { isSupportedFileFormat } from "./isSupportedFileFormat";
|
||||
|
||||
const defaultWikiLinkResolver = (target: string) => {
|
||||
// for [[#heading]] links
|
||||
if (!target) {
|
||||
return [];
|
||||
}
|
||||
let permalink = target.replace(/\/index$/, '');
|
||||
let permalink = target.replace(/\/index$/, "");
|
||||
// TODO what to do with [[index]] link?
|
||||
if (permalink.length === 0) {
|
||||
permalink = '/';
|
||||
permalink = "/";
|
||||
}
|
||||
return [permalink];
|
||||
};
|
||||
|
||||
export interface FromMarkdownOptions {
|
||||
pathFormat?:
|
||||
| 'raw' // default; use for regular relative or absolute paths
|
||||
| 'obsidian-absolute' // use for Obsidian-style absolute paths (with no leading slash)
|
||||
-| 'obsidian-short'; // use for Obsidian-style shortened paths (shortest path possible)
+| "raw" // default; use for regular relative or absolute paths
+| "obsidian-absolute" // use for Obsidian-style absolute paths (with no leading slash)
+| "obsidian-short"; // use for Obsidian-style shortened paths (shortest path possible)
permalinks?: string[]; // list of permalinks to match possible permalinks of a wiki link against
wikiLinkResolver?: (name: string) => string[]; // function to resolve wiki links to an array of possible permalinks
newClassName?: string; // class name to add to links that don't have a matching permalink
@@ -25,23 +25,14 @@ export interface FromMarkdownOptions {
hrefTemplate?: (permalink: string) => string; // function to generate the href attribute of a link
}

export function getImageSize(size: string) {
// eslint-disable-next-line prefer-const
let [width, height] = size.split('x');

if (!height) height = width;

return { width, height };
}

// mdast-util-from-markdown extension
// https://github.com/syntax-tree/mdast-util-from-markdown#extension
function fromMarkdown(opts: FromMarkdownOptions = {}) {
-const pathFormat = opts.pathFormat || 'raw';
+const pathFormat = opts.pathFormat || "raw";
const permalinks = opts.permalinks || [];
const wikiLinkResolver = opts.wikiLinkResolver || defaultWikiLinkResolver;
-const newClassName = opts.newClassName || 'new';
-const wikiLinkClassName = opts.wikiLinkClassName || 'internal';
+const newClassName = opts.newClassName || "new";
+const wikiLinkClassName = opts.wikiLinkClassName || "internal";
const defaultHrefTemplate = (permalink: string) => permalink;

const hrefTemplate = opts.hrefTemplate || defaultHrefTemplate;
@@ -53,9 +44,9 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
function enterWikiLink(token) {
this.enter(
{
-type: 'wikiLink',
+type: "wikiLink",
data: {
-isEmbed: token.isType === 'embed',
+isEmbed: token.isType === "embed",
target: null, // the target of the link, e.g. "Foo Bar#Heading" in "[[Foo Bar#Heading]]"
alias: null, // the alias of the link, e.g. "Foo" in "[[Foo Bar|Foo]]"
permalink: null, // TODO shouldn't this be named just "link"?
@@ -88,19 +79,19 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
data: { isEmbed, target, alias },
} = wikiLink;
-// eslint-disable-next-line no-useless-escape
-const wikiLinkWithHeadingPattern = /^(.*?)(#.*)?$/u;
-const [, path, heading = ''] = target.match(wikiLinkWithHeadingPattern);
+const wikiLinkWithHeadingPattern = /([\p{Letter}\d\s\/\.-_]*)(#.*)?/u;
+const [, path, heading = ""] = target.match(wikiLinkWithHeadingPattern);

const possibleWikiLinkPermalinks = wikiLinkResolver(path);

const matchingPermalink = permalinks.find((e) => {
return possibleWikiLinkPermalinks.find((p) => {
-if (pathFormat === 'obsidian-short') {
+if (pathFormat === "obsidian-short") {
if (e === p || e.endsWith(p)) {
return true;
}
-} else if (pathFormat === 'obsidian-absolute') {
-if (e === '/' + p) {
+} else if (pathFormat === "obsidian-absolute") {
+if (e === "/" + p) {
return true;
}
} else {
@@ -115,19 +106,20 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
// TODO this is ugly
const link =
matchingPermalink ||
-(pathFormat === 'obsidian-absolute'
-? '/' + possibleWikiLinkPermalinks[0]
+(pathFormat === "obsidian-absolute"
+? "/" + possibleWikiLinkPermalinks[0]
: possibleWikiLinkPermalinks[0]) ||
-'';
+"";

wikiLink.data.exists = !!matchingPermalink;
wikiLink.data.permalink = link;

// remove leading # if the target is a heading on the same page
-const displayName = alias || target.replace(/^#/, '');
-const headingId = heading.replace(/\s+/g, '-').toLowerCase();
+const displayName = alias || target.replace(/^#/, "");
+const headingId = heading.replace(/\s+/, "-").toLowerCase();
let classNames = wikiLinkClassName;
if (!matchingPermalink) {
-classNames += ' ' + newClassName;
+classNames += " " + newClassName;
}

if (isEmbed) {
@@ -135,55 +127,44 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
if (!isSupportedFormat) {
// Temporarily render note transclusion as a regular wiki link
if (!format) {
-wikiLink.data.hName = 'a';
+wikiLink.data.hName = "a";
wikiLink.data.hProperties = {
-className: classNames + ' ' + 'transclusion',
+className: classNames + " " + "transclusion",
href: hrefTemplate(link) + headingId,
};
-wikiLink.data.hChildren = [{ type: 'text', value: displayName }];
+wikiLink.data.hChildren = [{ type: "text", value: displayName }];

} else {
-wikiLink.data.hName = 'p';
+wikiLink.data.hName = "p";
wikiLink.data.hChildren = [
{
-type: 'text',
+type: "text",
value: `![[${target}]]`,
},
];
}
-} else if (format === 'pdf') {
-wikiLink.data.hName = 'iframe';
+} else if (format === "pdf") {
+wikiLink.data.hName = "iframe";
wikiLink.data.hProperties = {
className: classNames,
-width: '100%',
+width: "100%",
src: `${hrefTemplate(link)}#toolbar=0`,
};
} else {
-const hasDimensions = alias && /^\d+(x\d+)?$/.test(alias);
-// Take the target as alt text except if alt name was provided [[target|alt text]]
-const altText = hasDimensions || !alias ? target : alias;
-
-wikiLink.data.hName = 'img';
-wikiLink.data.hProperties = {
-className: classNames,
-src: hrefTemplate(link),
-alt: altText
-};
-
-if (hasDimensions) {
-const { width, height } = getImageSize(alias as string);
-Object.assign(wikiLink.data.hProperties, {
-width,
-height,
-});
-}
+wikiLink.data.hName = "img";
+wikiLink.data.hProperties = {
+className: classNames,
+src: hrefTemplate(link),
+alt: displayName,
+};
}
} else {
-wikiLink.data.hName = 'a';
+wikiLink.data.hName = "a";
wikiLink.data.hProperties = {
className: classNames,
href: hrefTemplate(link) + headingId,
};
-wikiLink.data.hChildren = [{ type: 'text', value: displayName }];
+wikiLink.data.hChildren = [{ type: "text", value: displayName }];
}
}

@@ -1,24 +1,23 @@
-import { getImageSize } from './fromMarkdown';
-import { isSupportedFileFormat } from './isSupportedFileFormat';
+import { isSupportedFileFormat } from "./isSupportedFileFormat";

const defaultWikiLinkResolver = (target: string) => {
// for [[#heading]] links
if (!target) {
return [];
}
-let permalink = target.replace(/\/index$/, '');
+let permalink = target.replace(/\/index$/, "");
// TODO what to do with [[index]] link?
if (permalink.length === 0) {
-permalink = '/';
+permalink = "/";
}
return [permalink];
};

export interface HtmlOptions {
pathFormat?:
-| 'raw' // default; use for regular relative or absolute paths
-| 'obsidian-absolute' // use for Obsidian-style absolute paths (with no leading slash)
-| 'obsidian-short'; // use for Obsidian-style shortened paths (shortest path possible)
+| "raw" // default; use for regular relative or absolute paths
+| "obsidian-absolute" // use for Obsidian-style absolute paths (with no leading slash)
+| "obsidian-short"; // use for Obsidian-style shortened paths (shortest path possible)
permalinks?: string[]; // list of permalinks to match possible permalinks of a wiki link against
wikiLinkResolver?: (name: string) => string[]; // function to resolve wiki links to an array of possible permalinks
newClassName?: string; // class name to add to links that don't have a matching permalink
@@ -29,11 +28,11 @@ export interface HtmlOptions {
// Micromark HtmlExtension
// https://github.com/micromark/micromark#htmlextension
function html(opts: HtmlOptions = {}) {
-const pathFormat = opts.pathFormat || 'raw';
+const pathFormat = opts.pathFormat || "raw";
const permalinks = opts.permalinks || [];
const wikiLinkResolver = opts.wikiLinkResolver || defaultWikiLinkResolver;
-const newClassName = opts.newClassName || 'new';
-const wikiLinkClassName = opts.wikiLinkClassName || 'internal';
+const newClassName = opts.newClassName || "new";
+const wikiLinkClassName = opts.wikiLinkClassName || "internal";
const defaultHrefTemplate = (permalink: string) => permalink;
const hrefTemplate = opts.hrefTemplate || defaultHrefTemplate;

@@ -42,21 +41,21 @@ function html(opts: HtmlOptions = {}) {
}

function enterWikiLink() {
-let stack = this.getData('wikiLinkStack');
-if (!stack) this.setData('wikiLinkStack', (stack = []));
+let stack = this.getData("wikiLinkStack");
+if (!stack) this.setData("wikiLinkStack", (stack = []));

stack.push({});
}

function exitWikiLinkTarget(token) {
const target = this.sliceSerialize(token);
-const current = top(this.getData('wikiLinkStack'));
+const current = top(this.getData("wikiLinkStack"));
current.target = target;
}

function exitWikiLinkAlias(token) {
const alias = this.sliceSerialize(token);
-const current = top(this.getData('wikiLinkStack'));
+const current = top(this.getData("wikiLinkStack"));
current.alias = alias;
}

@@ -65,7 +64,7 @@ function html(opts: HtmlOptions = {}) {
const { target, alias } = wikiLink;
const isEmbed = token.isType === "embed";
-// eslint-disable-next-line no-useless-escape
-const wikiLinkWithHeadingPattern = /^(.*?)(#.*)?$/u;
+const wikiLinkWithHeadingPattern = /([\w\s\/\.-]*)(#.*)?/;
const [, path, heading = ""] = target.match(wikiLinkWithHeadingPattern);

const possibleWikiLinkPermalinks = wikiLinkResolver(path);
@@ -100,7 +99,7 @@ function html(opts: HtmlOptions = {}) {
// remove leading # if the target is a heading on the same page
const displayName = alias || target.replace(/^#/, "");
// replace spaces with dashes and lowercase headings
-const headingId = heading.replace(/\s+/g, "-").toLowerCase();
+const headingId = heading.replace(/\s+/, "-").toLowerCase();
let classNames = wikiLinkClassName;
if (!matchingPermalink) {
classNames += " " + newClassName;
@@ -112,9 +111,7 @@ function html(opts: HtmlOptions = {}) {
// Temporarily render note transclusion as a regular wiki link
if (!format) {
this.tag(
-`<a href="${hrefTemplate(
-link + headingId
-)}" class="${classNames} transclusion">`
+`<a href="${hrefTemplate(link + headingId)}" class="${classNames} transclusion">`
);
this.raw(displayName);
this.tag("</a>");
@@ -128,18 +125,11 @@ function html(opts: HtmlOptions = {}) {
)}#toolbar=0" class="${classNames}" />`
);
} else {
-const hasDimensions = alias && /^\d+(x\d+)?$/.test(alias);
-// Take the target as alt text except if alt name was provided [[target|alt text]]
-const altText = hasDimensions || !alias ? target : alias;
-let imgAttributes = `src="${hrefTemplate(
-link
-)}" alt="${altText}" class="${classNames}"`;
-
-if (hasDimensions) {
-const { width, height } = getImageSize(alias as string);
-imgAttributes += ` width="${width}" height="${height}"`;
-}
-this.tag(`<img ${imgAttributes} />`);
+this.tag(
+`<img src="${hrefTemplate(
+link
+)}" alt="${displayName}" class="${classNames}" />`
+);
}
} else {
this.tag(

@@ -38,5 +38,6 @@ const defaultPathToPermalinkFunc = (
.replace(markdownFolder, "") // make the permalink relative to the markdown folder
.replace(/\.(mdx|md)/, "")
+.replace(/\\/g, "/") // replace windows backslash with forward slash
.replace(/\/index$/, ""); // remove index from the end of the permalink
return permalink.length > 0 ? permalink : "/"; // for home page
};

@@ -1,20 +1,23 @@
import * as path from "path";
// import * as url from "url";
import { getPermalinks } from "../src/utils";

// const __dirname = url.fileURLToPath(new URL(".", import.meta.url));
// const markdownFolder = path.join(__dirname, "/fixtures/content");
const markdownFolder = path.join(
".",
-"test/fixtures/content"
+"/packages/remark-wiki-link/test/fixtures/content"
);

describe("getPermalinks", () => {
test("should return an array of permalinks", () => {
const expectedPermalinks = [
"/README",
"/", // /index.md
"/abc",
"/blog/first-post",
"/blog/Second Post",
"/blog/third-post",
"/blog/README",
"/blog", // /blog/index.md
"/blog/tutorials/first-tutorial",
"/assets/Pasted Image 123.png",
];
@@ -25,4 +28,35 @@ describe("getPermalinks", () => {
expect(expectedPermalinks).toContain(permalink);
});
});

test("should return an array of permalinks with custom path -> permalink converter function", () => {
const expectedPermalinks = [
"/", // /index.md
"/abc",
"/blog/first-post",
"/blog/second-post",
"/blog/third-post",
"/blog", // /blog/index.md
"/blog/tutorials/first-tutorial",
"/assets/pasted-image-123.png",
];

const func = (filePath: string, markdownFolder: string) => {
const permalink = filePath
.replace(markdownFolder, "") // make the permalink relative to the markdown folder
.replace(/\.(mdx|md)/, "")
.replace(/\\/g, "/") // replace windows backslash with forward slash
.replace(/\/index$/, "") // remove index from the end of the permalink
.replace(/ /g, "-") // replace spaces with hyphens
.toLowerCase(); // convert to lowercase

return permalink.length > 0 ? permalink : "/"; // for home page
};

const permalinks = getPermalinks(markdownFolder, [/\.DS_Store/], func);
expect(permalinks).toHaveLength(expectedPermalinks.length);
permalinks.forEach((permalink) => {
expect(expectedPermalinks).toContain(permalink);
});
});
});

@@ -48,7 +48,7 @@ describe("micromark-extension-wiki-link", () => {
html({
permalinks: ["/some/folder/Wiki Link"],
pathFormat: "obsidian-short",
-}) as any, // TODO type fix
+}) as any // TODO type fix
],
});
expect(serialized).toBe(
@@ -75,7 +75,7 @@ describe("micromark-extension-wiki-link", () => {
html({
permalinks: ["/some/folder/Wiki Link"],
pathFormat: "obsidian-absolute",
-}) as any, // TODO type fix
+}) as any // TODO type fix
],
});
expect(serialized).toBe(
@@ -97,14 +97,10 @@ describe("micromark-extension-wiki-link", () => {
});

test("parses a wiki link with heading and alias", () => {
-const serialized = micromark(
-"[[Wiki Link#Some Heading|Alias]]",
-"ascii",
-{
-extensions: [syntax()],
-htmlExtensions: [html() as any], // TODO type fix
-}
-);
+const serialized = micromark("[[Wiki Link#Some Heading|Alias]]", "ascii", {
+extensions: [syntax()],
+htmlExtensions: [html() as any], // TODO type fix
+});
// note: lowercased and hyphenated heading
expect(serialized).toBe(
'<p><a href="Wiki Link#some-heading" class="internal new">Alias</a></p>'
@@ -138,7 +134,7 @@ describe("micromark-extension-wiki-link", () => {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
-expect(serialized).toBe('<p>![[My Image.xyz]]</p>');
+expect(serialized).toBe("<p>![[My Image.xyz]]</p>");
});

test("parses an image embed with a matching permalink", () => {
@@ -151,28 +147,6 @@ describe("micromark-extension-wiki-link", () => {
);
});

// TODO: Fix alt attribute
test("Can identify the dimensions of the image if exists", () => {
const serialized = micromark("![[My Image.jpg|200]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html({ permalinks: ["My Image.jpg"] }) as any], // TODO type fix
});
expect(serialized).toBe(
'<p><img src="My Image.jpg" alt="My Image.jpg" class="internal" width="200" height="200" /></p>'
);
});

// TODO: Fix alt attribute
test("Can identify the dimensions of the image if exists", () => {
const serialized = micromark("![[My Image.jpg|200x200]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html({ permalinks: ["My Image.jpg"] }) as any], // TODO type fix
});
expect(serialized).toBe(
'<p><img src="My Image.jpg" alt="My Image.jpg" class="internal" width="200" height="200" /></p>'
);
});

test("parses an image embed with a matching permalink and Obsidian-style shortened path", () => {
const serialized = micromark("![[My Image.jpg]]", {
extensions: [syntax()],
@@ -180,7 +154,7 @@ describe("micromark-extension-wiki-link", () => {
html({
permalinks: ["/assets/My Image.jpg"],
pathFormat: "obsidian-short",
-}) as any, // TODO type fix
+}) as any // TODO type fix
],
});
expect(serialized).toBe(
@@ -215,7 +189,7 @@ describe("micromark-extension-wiki-link", () => {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
-expect(serialized).toBe('<p>[[Wiki Link</p>');
+expect(serialized).toBe("<p>[[Wiki Link</p>");
});

test("doesn't parse a wiki link with one missing closing bracket", () => {
@@ -223,7 +197,7 @@ describe("micromark-extension-wiki-link", () => {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
-expect(serialized).toBe('<p>[[Wiki Link]</p>');
+expect(serialized).toBe("<p>[[Wiki Link]</p>");
});

test("doesn't parse a wiki link with a missing opening bracket", () => {
@@ -231,7 +205,7 @@ describe("micromark-extension-wiki-link", () => {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
-expect(serialized).toBe('<p>[Wiki Link]]</p>');
+expect(serialized).toBe("<p>[Wiki Link]]</p>");
});

test("doesn't parse a wiki link in single brackets", () => {
@@ -239,7 +213,7 @@ describe("micromark-extension-wiki-link", () => {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
-expect(serialized).toBe('<p>[Wiki Link]</p>');
+expect(serialized).toBe("<p>[Wiki Link]</p>");
});
});

@@ -251,7 +225,7 @@ describe("micromark-extension-wiki-link", () => {
html({
newClassName: "test-new",
wikiLinkClassName: "test-wiki-link",
-}) as any, // TODO type fix
+}) as any // TODO type fix
],
});
expect(serialized).toBe(
@@ -277,7 +251,7 @@ describe("micromark-extension-wiki-link", () => {
wikiLinkResolver: (page) => [
page.replace(/\s+/, "-").toLowerCase(),
],
-}) as any, // TODO type fix
+}) as any // TODO type fix
],
});
expect(serialized).toBe(
@@ -286,6 +260,56 @@ describe("micromark-extension-wiki-link", () => {
});
});

test("parses wiki links to index files", () => {
const serialized = micromark("[[/some/folder/index]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
expect(serialized).toBe(
'<p><a href="/some/folder" class="internal new">/some/folder/index</a></p>'
);
});

describe("other", () => {
test("parses a wiki link to some index page in a folder with no matching permalink", () => {
const serialized = micromark("[[/some/folder/index]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
expect(serialized).toBe(
'<p><a href="/some/folder" class="internal new">/some/folder/index</a></p>'
);
});

test("parses a wiki link to some index page in a folder with a matching permalink", () => {
const serialized = micromark("[[/some/folder/index]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html({ permalinks: ["/some/folder"] }) as any], // TODO type fix
});
expect(serialized).toBe(
'<p><a href="/some/folder" class="internal">/some/folder/index</a></p>'
);
});

test("parses a wiki link to home index page with no matching permalink", () => {
const serialized = micromark("[[/index]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html() as any], // TODO type fix
});
expect(serialized).toBe(
'<p><a href="/" class="internal new">/index</a></p>'
);
});

test("parses a wiki link to home index page with a matching permalink", () => {
const serialized = micromark("[[/index]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html({ permalinks: ["/"] }) as any], // TODO type fix
});
expect(serialized).toBe('<p><a href="/" class="internal">/index</a></p>');
});
});

describe("transclusions", () => {
test("parses a transclusion as a regular wiki link", () => {
const serialized = micromark("![[Some Page]]", "ascii", {
@@ -297,14 +321,4 @@ describe("micromark-extension-wiki-link", () => {
);
});
});

describe("Links with special characters", () => {
test("parses a link with special characters and symbols", () => {
const serialized = micromark("[[li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#LI NK-W(i)th-àcèô íã_a(n)d_uNdErlinE!:ª%@'*º$ °~./\\]]", "ascii", {
extensions: [syntax()],
htmlExtensions: [html() as any],
});
expect(serialized).toBe(`<p><a href="li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô-íã_a(n)d_underline!:ª%@'*º$-°~./\\" class="internal new">li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#LI NK-W(i)th-àcèô íã_a(n)d_uNdErlinE!:ª%@'*º$ °~./\\</a></p>`);
});
});
});

@@ -246,28 +246,6 @@ describe("remark-wiki-link", () => {
expect(node.data?.hName).toEqual("img");
expect((node.data?.hProperties as any).src).toEqual("My Image.png");
expect((node.data?.hProperties as any).alt).toEqual("My Image.png");
expect((node.data?.hProperties as any).width).toBeUndefined();
expect((node.data?.hProperties as any).height).toBeUndefined();
});
});

test("Can identify the dimensions of the image if exists", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("![[My Image.png|132x612]]");
ast = processor.runSync(ast);

expect(select("wikiLink", ast)).not.toEqual(null);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.isEmbed).toEqual(true);
expect(node.data?.target).toEqual("My Image.png");
expect(node.data?.permalink).toEqual("My Image.png");
expect(node.data?.hName).toEqual("img");
expect((node.data?.hProperties as any).src).toEqual("My Image.png");
expect((node.data?.hProperties as any).alt).toEqual("My Image.png");
expect((node.data?.hProperties as any).width).toBe("132");
expect((node.data?.hProperties as any).height).toBe("612");
});
});

@@ -383,36 +361,6 @@ describe("remark-wiki-link", () => {
});
});

describe("Links with special characters", () => {
test("parses a link with special characters and symbols", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse(
"[[li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô íã_a(n)D_UNDERLINE!:ª%@'*º$ °~./\\]]"
);
ast = processor.runSync(ast);
expect(select("wikiLink", ast)).not.toEqual(null);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual(
"li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\"
);
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
"internal new"
);
expect((node.data?.hProperties as any).href).toEqual(
"li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô-íã_a(n)d_underline!:ª%@'*º$-°~./\\"
);
expect((node.data?.hChildren as any)[0].value).toEqual(
"li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô íã_a(n)D_UNDERLINE!:ª%@'*º$ °~./\\"
);
});
});
});

describe("invalid wiki links", () => {
test("doesn't parse a wiki link with two missing closing brackets", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);
@@ -485,6 +433,109 @@ describe("remark-wiki-link", () => {
});
});

test("parses wiki links to index files", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

expect(select("wikiLink", ast)).not.toEqual(null);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal new");
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

describe("other", () => {
test("parses a wiki link to some index page in a folder with no matching permalink", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
"internal new"
);
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

test("parses a wiki link to some index page in a folder with a matching permalink", () => {
const processor = unified()
.use(markdown)
.use(wikiLinkPlugin, { permalinks: ["/some/folder"] });

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(true);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal");
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

test("parses a wiki link to home index page with no matching permalink", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
"internal new"
);
expect((node.data?.hProperties as any).href).toEqual("/");
expect((node.data?.hChildren as any)[0].value).toEqual("/index");
});
});

test("parses a wiki link to home index page with a matching permalink", () => {
const processor = unified()
.use(markdown)
.use(wikiLinkPlugin, { permalinks: ["/"] });

let ast = processor.parse("[[/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(true);
expect(node.data?.permalink).toEqual("/");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal");
expect((node.data?.hProperties as any).href).toEqual("/");
expect((node.data?.hChildren as any)[0].value).toEqual("/index");
});
});
});

describe("transclusions", () => {
test("replaces a transclusion with a regular wiki link", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

@@ -12,7 +12,7 @@ export default function JSONLD({
return <></>;
}

-const baseUrl = process.env.NEXT_PUBLIC_SITE_URL || 'https://portaljs.com';
+const baseUrl = process.env.NEXT_PUBLIC_SITE_URL || 'https://portaljs.org';
const pageUrl = `${baseUrl}/${meta.urlPath}`;

const imageMatches = source.match(

@@ -81,6 +81,7 @@ export default function Layout({
}
return section.children.findIndex(isActive) > -1;
}

return (
<>
{title && <NextSeo title={title} description={description} />}

@@ -22,41 +22,11 @@ const items = [
sourceUrl: 'https://github.com/FCSCOpendata/frontend',
},
{
-title: 'Frictionless Data',
-href: 'https://datahub.io/core/co2-ppm',
-repository: 'https://github.com/datopian/datahub/tree/main/examples/dataset-frictionless',
-image: '/images/showcases/frictionless-capture.png',
-description: 'Progressive open-source framework for building data infrastructure - data management, data integration, data flows, etc. It includes various data standards and provides software to work with data.',
+title: 'Datahub Open Data',
+href: 'https://opendata.datahub.io/',
+image: '/images/showcases/datahub.webp',
+description: 'Demo Data Portal by DataHub',
},
{
title: "OpenSpending",
image: "/images/showcases/openspending.png",
href: "https://www.openspending.org",
repository: 'https://github.com/datopian/datahub/tree/main/examples/openspending',
description: "OpenSpending is a free, open and global platform to search, visualise and analyse fiscal data in the public sphere."
},
{
title: "FiveThirtyEight",
image: "/images/showcases/fivethirtyeight.png",
href: "https://fivethirtyeight.portaljs.org/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/fivethirtyeight',
description: "This is a replica of data.fivethirtyeight.com using PortalJS."
},
{
title: "Github Datasets",
image: "/images/showcases/github-datasets.png",
href: "https://example.portaljs.org/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog',
description: "A simple data catalog that get its data from a list of GitHub repos that serve as datasets."
},
{
title: "Hatespeech Data",
image: "/images/showcases/turing.png",
href: "https://hatespeechdata.com/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/turing',
description: "Datasets annotated for hate speech, online abuse, and offensive language which are useful for training a natural language processing system to detect this online abuse."
},

];

export default function Showcases() {

@@ -1,6 +1,10 @@
export default function ShowcasesItem({ item }) {
  return (
    <div className="rounded overflow-hidden group relative border-1 shadow-lg">
    <a
      className="rounded overflow-hidden group relative border-1 shadow-lg"
      target="_blank"
      href={item.href}
    >
      <div
        className="bg-cover bg-no-repeat bg-top aspect-video w-full group-hover:blur-sm group-hover:scale-105 transition-all duration-200"
        style={{ backgroundImage: `url(${item.image})` }}
@@ -12,48 +16,9 @@ export default function ShowcasesItem({ item }) {
        <div className="text-center text-primary-dark">
          <span className="text-xl font-semibold">{item.title}</span>
          <p className="text-base font-medium">{item.description}</p>
          <div className="flex justify-center mt-2 gap-2 ">
            {item.href && (
              <a
                target="_blank"
                className=" text-white w-8 h-8 p-1 bg-primary rounded-full hover:scale-110 transition cursor-pointer z-50"
                rel="noreferrer"
                href={item.href}
              >
                <svg
                  xmlns="http://www.w3.org/2000/svg"
                  viewBox="0 0 420 420"
                  stroke="white"
                  fill="none"
                >
                  <path stroke-width="26" d="M209,15a195,195 0 1,0 2,0z" />
                  <path
                    stroke-width="18"
                    d="m210,15v390m195-195H15M59,90a260,260 0 0,0 302,0 m0,240 a260,260 0 0,0-302,0M195,20a250,250 0 0,0 0,382 m30,0 a250,250 0 0,0 0-382"
                  />
                </svg>
              </a>
            )}
            {item.repository && (
              <a
                target="_blank"
                rel="noreferrer"
                className="w-8 h-8 bg-black rounded-full p-1 hover:scale-110 transition cursor-pointer z-50"
                href={item.repository}
              >
                <svg
                  aria-hidden="true"
                  viewBox="0 0 16 16"
                  fill="currentColor"
                >
                  <path d="M8 0C3.58 0 0 3.58 0 8C0 11.54 2.29 14.53 5.47 15.59C5.87 15.66 6.02 15.42 6.02 15.21C6.02 15.02 6.01 14.39 6.01 13.72C4 14.09 3.48 13.23 3.32 12.78C3.23 12.55 2.84 11.84 2.5 11.65C2.22 11.5 1.82 11.13 2.49 11.12C3.12 11.11 3.57 11.7 3.72 11.94C4.44 13.15 5.59 12.81 6.05 12.6C6.12 12.08 6.33 11.73 6.56 11.53C4.78 11.33 2.92 10.64 2.92 7.58C2.92 6.71 3.23 5.99 3.74 5.43C3.66 5.23 3.38 4.41 3.82 3.31C3.82 3.31 4.49 3.1 6.02 4.13C6.66 3.95 7.34 3.86 8.02 3.86C8.7 3.86 9.38 3.95 10.02 4.13C11.55 3.09 12.22 3.31 12.22 3.31C12.66 4.41 12.38 5.23 12.3 5.43C12.81 5.99 13.12 6.7 13.12 7.58C13.12 10.65 11.25 11.33 9.47 11.53C9.76 11.78 10.01 12.26 10.01 13.01C10.01 14.08 10 14.94 10 15.21C10 15.42 10.15 15.67 10.55 15.59C13.71 14.53 16 11.53 16 8C16 3.58 12.42 0 8 0Z" />
                </svg>
              </a>
            )}
          </div>
        </div>
      </div>
    </div>
  </div>
</div>
</a>
  );
}
Binary file not shown. Before Width: | Height: | Size: 96 KiB
Binary file not shown. Before Width: | Height: | Size: 65 KiB
Binary file not shown. Before Width: | Height: | Size: 144 KiB
@@ -1,23 +1,21 @@
---
title: 'Adding Maps to PortalJS: Enhancing Geospatial Data Visualization with PortalJS'
title: 'Enhancing Geospatial Data Visualization with PortalJS'
date: 2023-07-18
authors: ['João Demenech', 'Luccas Mateus', 'Yoana Popova']
filetype: 'blog'
---

This post walks you through adding maps and geospatial visualizations to PortalJS.
Are you keen on building rich and interactive data portals? Do you find value in the power and flexibility of JavaScript, Nextjs, and React? In that case, allow us to introduce you to [PortalJS](https://portaljs.org/), a state-of-the-art framework leveraging these technologies to help you build amazing data portals.

Are you interested in building rich and interactive data portals? Do you find value in the power and flexibility of JavaScript, Nextjs, and React? If so, [PortalJS](https://portaljs.com/) is for you. It's a state-of-the-art framework leveraging these technologies to help you build rich data portals.
Perhaps you already understand that effective data visualization lies in the adept utilization of various data components. Within [PortalJS](https://portaljs.org/), we take data visualization a step further. It's not just about displaying data - it's about telling a captivating story through the strategic orchestration of a diverse array of data components.

Effective data visualization lies in the use of various data components. Within [PortalJS](https://portaljs.com/), we take data visualization a step further. It's not just about displaying data - it's about telling a story through combining a variety of data components.

In this post we will share our latest enhancement to PortalJS: maps, a powerful tool for visualizing geospatial data. We will take you on a tour of our experiments and progress in enhancing map functionalities on PortalJS. The journey is still in its early stages, with new facets being unveiled and refined as we perfect our API.
We are now eager to share our latest enhancement to [PortalJS](https://portaljs.org/): maps, a powerful tool for visualizing geospatial data. In this post, we will take you on a tour of our experiments and progress in enhancing map functionalities on [PortalJS](https://portaljs.org/). Our journey into this innovative feature is still in its early stages, with new facets being unveiled and refined as we perfect our API. Still, this exciting development opens a new avenue for visualizing data, enhancing your ability to convey complex geospatial information with clarity and precision.

## Exploring Map Formats

Maps play a crucial role in geospatial data visualization. Several formats exist for storing and sharing this type of data, with GeoJSON, KML, and shapefiles being among the most popular. As a prominent figure in the field of open-source data portal platforms, [PortalJS](https://portaljs.com/) strives to support as many map formats as possible.
Maps play a crucial role in geospatial data visualization. Several formats exist for storing and sharing this type of data, with GeoJSON, KML, and shapefiles being among the most popular. As a prominent figure in the field of open-source data portal platforms, [PortalJS](https://portaljs.org/) strives to support as many map formats as possible.

Taking inspiration from the ckanext-geoview extension, we currently support KML and GeoJSON formats in [PortalJS](https://portaljs.com/). This remarkable extension is a plugin for CKAN, the world's leading open source data management system, that enables users to visualize geospatial data in diverse formats on an interactive map. Apart from KML and GeoJSON formats support, our roadmap entails extending compatibility to encompass all other formats supported by ckanext-geoview. Rest assured, we are committed to empowering users with a wide array of map format options in the future.
Taking inspiration from the ckanext-geoview extension, we currently support KML and GeoJSON formats in [PortalJS](https://portaljs.org/). This remarkable extension is a plugin for CKAN, the world's leading open source data management system, that enables users to visualize geospatial data in diverse formats on an interactive map. Apart from KML and GeoJSON formats support, our roadmap entails extending compatibility to encompass all other formats supported by ckanext-geoview. Rest assured, we are committed to empowering users with a wide array of map format options in the future.
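To give a concrete feel for the simplest of these formats, here is a minimal GeoJSON `FeatureCollection` with a single point feature (an illustrative sketch following the GeoJSON specification; the place name and coordinates are placeholders). Note that GeoJSON coordinates are ordered `[longitude, latitude]`:

```json
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": { "type": "Point", "coordinates": [-0.1276, 51.5072] },
      "properties": { "name": "London" }
    }
  ]
}
```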

So, what makes these formats special?

@@ -27,7 +25,7 @@ So, what makes these formats special?

## Unveiling the Power of Leaflet and OpenLayers

To display maps in [PortalJS](https://portaljs.com/), we utilize two powerful JavaScript libraries for creating interactive maps based on different layers: Leaflet and OpenLayers. Each offers distinct advantages (and disadvantages), inspiring us to integrate both and give users the flexibility to choose.
To display maps in [PortalJS](https://portaljs.org/), we utilize two powerful JavaScript libraries for creating interactive maps based on different layers: Leaflet and OpenLayers. Each offers distinct advantages (and disadvantages), inspiring us to integrate both and give users the flexibility to choose.

Leaflet is the leading open-source JavaScript library known for its mobile-friendly, interactive maps. With its compact size (just 42 KB of JS), it provides all the map features most developers need. Leaflet is designed with simplicity, performance and usability in mind. It works efficiently across all major desktop and mobile platforms.

@@ -59,8 +57,8 @@ Users can also choose a region of focus, which will depend on the data, by setti

Through our ongoing enhancements to the [PortalJS library](https://storybook.portaljs.org/), we aim to empower users to create engaging and informative data portals featuring diverse map formats and data components.

Why not give [PortalJS](https://portaljs.com/) a try today and discover the possibilities for your own data portals? To get started, check out our comprehensive documentation here: [PortalJS Documentation](https://portaljs.com/opensource).
Why not give [PortalJS](https://portaljs.org/) a try today and discover the possibilities for your own data portals? To get started, check out our comprehensive documentation here: [PortalJS Documentation](https://portaljs.org/docs).

Have questions or comments about using [PortalJS](https://portaljs.com/) for your data portals? Feel free to share your thoughts on our [Discord channel](https://discord.com/invite/EeyfGrGu4U). We're here to help you make the most of your data.
Have questions or comments about using [PortalJS](https://portaljs.org/) for your data portals? Feel free to share your thoughts on our [Discord channel](https://discord.com/invite/EeyfGrGu4U). We're here to help you make the most of your data.

Stay tuned for more exciting developments as we continue to enhance [PortalJS](https://portaljs.com/)!
Stay tuned for more exciting developments as we continue to enhance [PortalJS](https://portaljs.org/)!
@@ -4,7 +4,7 @@ authors: ['Luccas Mateus']
date: 2021-04-20
---

We have created a full data portal demo using DataHub PortalJS, backed by a CKAN instance storing data and metadata; below you can see a screenshot of the homepage and of an individual dataset page.
We have created a full data portal demo using PortalJS, backed by a CKAN instance storing data and metadata; below you can see a screenshot of the homepage and of an individual dataset page.



@@ -14,7 +14,7 @@ We have created a full data portal demo using DataHub PortalJS all backed by a C
To create a Portal app, run the following command in your terminal:

```console
npx create-next-app -e https://github.com/datopian/datahub/tree/main/examples/ckan
npx create-next-app -e https://github.com/datopian/portaljs/tree/main/examples/ckan
```

> NB: Under the hood, this uses the tool called create-next-app, which bootstraps an app for you based on our CKAN example.
@@ -3,7 +3,6 @@ title: 'Announcing MarkdownDB: an open source tool to create an SQL API to your
description: MarkdownDB - an open source library to transform markdown content into sql-queryable data. Build rich markdown-powered sites easily and reliably. New dedicated website at markdowndb.com
date: 2023-10-11
authors: ['Ola Rubaj']
filetype: blog
---

Hello, dear readers!

@@ -2,7 +2,6 @@
title: What We Shipped in Jul-Aug 2023
authors: ['ola-rubaj']
date: 2023-09-2
filetype: blog
---

Hey everyone! 👋 Summer has been in full swing, and while I've managed to catch some vacation vibes, I've also been deep into code. I'm super excited to share some of the latest updates and features we've rolled out over the past two months. Let's dive in:
@@ -30,12 +29,12 @@ https://github.com/datopian/markdowndb

## 📚 The Guide

https://portaljs.com/opensource
https://portaljs.org/guide

I’ve sketched overviews for two upcoming tutorials:

1. **Collaborating with others on your website**: Learn how to make your website projects a team effort. [See it here](https://portaljs.com/guide#tutorial-3-collaborating-with-others-on-your-website-project)
2. **Customising your website and previewing your changes locally**: Customize and preview your site changes locally, without headaches. [See it here](https://portaljs.com/guide#tutorial-4-customising-your-website-locally-and-previewing-your-changes-locally)
1. **Collaborating with others on your website**: Learn how to make your website projects a team effort. [See it here](https://portaljs.org/guide#tutorial-3-collaborating-with-others-on-your-website-project)
2. **Customising your website and previewing your changes locally**: Customize and preview your site changes locally, without headaches. [See it here](https://portaljs.org/guide#tutorial-4-customising-your-website-locally-and-previewing-your-changes-locally)

## 🌐 LifeItself.org
@@ -1,34 +0,0 @@
---
title: 'The OpenSpending Revamp: Behind the Scenes'
date: 2023-10-13
authors: ['Luccas Mateus', 'João Demenech']
filetype: 'blog'
---

_This post was originally published on [the Datopian blog](http://datopian.com/blog/the-open-spending-revamp-behind-the-scenes)._

In our last article, we explored [the Open Spending revamp](https://www.datopian.com/blog/the-open-spending-revamp). Now, let's dive into the tech stack that makes it tick. We'll unpack how PortalJS, Cloudflare R2, Frictionless Data Packages, and Octokit come together to power this next-level data portal. From our Javascript framework PortalJS, that shapes the user experience, to Cloudflare R2, the robust storage solution that secures the data, we'll examine how each piece of technology contributes to the bigger picture. We'll also delve into the roles of Frictionless Data Packages for metadata management and Octokit for automating dataset metadata retrieval. Read on for the inside scoop!

## The Core: PortalJS

At the core of the revamped OpenSpending website is [PortalJS](https://portaljs.com), a JavaScript library that's a game-changer in building powerful data portals with data visualizations. What makes it so special? Well, it's packed with reusable React components that make our lives - and yours - a whole lot easier. Take, for example, our sleek CSV previews; they're brought to life by PortalJS' [FlatUI Component](https://storybook.portaljs.org/?path=/story/components-flatuitable--from-url). It helps transform raw numbers into visuals that you can easily understand and use. Curious to know more? Check out the [official PortalJS website](https://portaljs.com).



## Metadata: Frictionless Data Packages

Storing metadata might seem like a backstage operation, but it is pivotal. We chose Frictionless Data Packages, housed in the `os-data` GitHub organization as repositories, to serve this purpose. Frictionless Data Packages offer a simple but powerful format for cataloging and packaging a collection of data - in our scenario, that's primarily tabular data. These aren't merely storage bins - they align with FAIR principles, ensuring that the data is easily Findable, Accessible, Interoperable, and Reusable. This alignment positions them as an ideal solution for publishing datasets designed to be both openly accessible and highly usable. Learn more from their [official documentation](https://framework.frictionlessdata.io/).

## The Link: Octokit

Can you imagine having to manually gather metadata for each dataset from multiple GitHub repositories? Sounds tedious, right? That’s why we used Octokit, a GitHub API client for Node.js. This tool takes care of the heavy lifting, automating the metadata retrieval process for us. If you're intrigued by Octokit's capabilities, you can discover more in its [GitHub repository](https://github.com/octokit/octokit.js). To explore the datasets we've been working on, take a look at [OpenSpending Datasets](https://github.com/os-data). To give a flavour of the kind of step this automation replaces, here is a minimal sketch (not the actual OpenSpending code; the repo names are hypothetical) of how the raw URL for each repository's `datapackage.json` could be derived before fetching it over the GitHub API:
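A minimal sketch (not the actual OpenSpending code; repo names are hypothetical) of deriving the raw `datapackage.json` URL for each dataset repository in the `os-data` organization, the kind of bookkeeping the Octokit-based automation takes care of:

```javascript
// Build the raw.githubusercontent.com URL at which a dataset repo's
// datapackage.json can be fetched. The branch defaults to "main".
function datapackageUrl(org, repo, branch = 'main') {
  return `https://raw.githubusercontent.com/${org}/${repo}/${branch}/datapackage.json`;
}

// Hypothetical repo names, for illustration only.
const repos = ['example-budget', 'example-spending'];
const urls = repos.map((repo) => datapackageUrl('os-data', repo));
console.log(urls);
```

In the real portal, Octokit would list the repositories programmatically instead of hard-coding them, and each URL would then be fetched and parsed for title, description, and resource metadata.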

## Storage: Cloudflare R2

When it comes to data storage, Cloudflare R2 emerges as our choice, defined by its blend of speed and reliability. This service empowers developers to securely store large amounts of blob data without the costly egress bandwidth fees associated with typical cloud storage services. For a comprehensive understanding of Cloudflare R2, their [blog post](https://cloudflare.net/news/news-details/2021/Cloudflare-Announces-R2-Storage-Rapid-and-Reliable-S3-Compatible-Object-Storage-Designed-for-the-Edge/default.aspx) serves as an excellent resource.

## In Closing

In closing, we invite you to explore the architecture and code that power this project. It's all openly accessible in our [GitHub repository](https://github.com/datopian/portaljs/tree/main/examples/openspending). Should you want to experience the end result firsthand, feel free to visit [openspending.org](https://www.openspending.org/). If you encounter any issues or have suggestions to improve the project, we welcome your contributions via our [GitHub issues page](https://github.com/datopian/portaljs/issues). For real-time assistance and to engage with our community, don't hesitate to join our [Discord Channel](https://discord.com/invite/EeyfGrGu4U). Thank you for taking the time to read about our work! We look forward to fostering a collaborative environment where knowledge is freely shared and continually enriched. ♥️


@@ -1,7 +1,7 @@
const config = {
  title: 'DataHub PortalJS - The JavaScript framework for data portals.',
  title: 'PortalJS - The JavaScript framework for data portals.',
  description:
    'DataHub PortalJS is a JavaScript framework for rapidly building rich data portal frontends using a modern frontend approach.',
    'PortalJS is a JavaScript framework for rapidly building rich data portal frontends using a modern frontend approach.',
  theme: {
    default: 'dark',
    toggleIcon: '/images/theme-button.svg',
@@ -11,18 +11,20 @@ const config = {
  authorUrl: 'https://datopian.com/',
  navbarTitle: {
    // logo: "/images/logo.svg",
    text: '🌀 DataHub PortalJS',
    text: '🌀 PortalJS',
    // version: "Alpha",
  },
  navLinks: [
    { name: 'Docs', href: '/docs' },
    // { name: "Components", href: "/docs/components" },
    { name: 'Blog', href: '/blog' },
    { name: 'Showcases', href: '/#showcases' },
    { name: 'Howtos', href: '/howtos' },
    { name: 'Guide', href: '/guide' },
    {
      name: 'Showcases',
      href: '/showcases/'
      name: 'Examples',
      href: 'https://github.com/datopian/portaljs/tree/main/examples',
      target: '_blank',
    },
    {
      name: 'Components',
@@ -67,8 +69,8 @@ const config = {
      cardType: 'summary_large_image',
    },
  },
  github: 'https://github.com/datopian/datahub',
  discord: 'https://discord.gg/KrRzMKU',
  github: 'https://github.com/datopian/portaljs',
  discord: 'https://discord.gg/EeyfGrGu4U',
  tableOfContents: true,
  analytics: 'G-96GWZHMH57',
  // editLinkShow: true,
@@ -26,7 +26,7 @@ Below are some screenshots:
- Create a new app with `create-next-app`:

```
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan-example
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan-example
cd <app-name>
```

@@ -49,7 +49,7 @@ If yo go to any one of those pages by clicking on `More info` you will see somet

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fckan-example&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fckan-example&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)

By clicking on this button, you will be redirected to a page which will allow you to clone the content into your own github/gitlab/bitbucket account and automatically deploy everything.

@@ -70,6 +70,6 @@ npm run start

## Links

- [Repo](https://github.com/datopian/datahub/tree/main/examples/ckan-example)
- [Repo](https://github.com/datopian/portaljs/tree/main/examples/ckan-example)
- [Live Demo](https://ckan-example.portaljs.org)

@@ -26,7 +26,7 @@ To get a feel of the project, check out the demo at [live deployment](https://ck
Navigate to the directory in which you want to create the project folder and run the following command:

```
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan
cd <app-name>
```

@@ -56,7 +56,7 @@ If you navigate to any of the dataset pages by clicking on the dataset title you

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fckan&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fckan&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)

By clicking on this button, you will be redirected to a page which allows you to clone the base project into your own GitHub/GitLab/BitBucket account and automatically deploy it.

@@ -158,6 +158,6 @@ Thanks to TypeScript, you can get a list of all the API methods in `@portaljs/ck

## Links

- [Repo](https://github.com/datopian/datahub/tree/main/examples/ckan)
- [Repo](https://github.com/datopian/portaljs/tree/main/examples/ckan)
- [Live Demo](http://ckan.portaljs.org/)
@@ -0,0 +1,48 @@
---
title: "Example: showcase for a single Frictionless dataset"
authors: ['Luccas Mateus']
date: 2023-04-20
filetype: blog
---

**See the repo:** https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless

This example creates a portal/showcase for a single dataset. The dataset should be a [Frictionless dataset (data package)][fd] i.e. there should be a `datapackage.json`.

[fd]: https://frictionlessdata.io/data-packages/
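For orientation, a minimal `datapackage.json` for such a dataset might look as follows (an illustrative sketch following the Frictionless Data Package format; the dataset name and file path are placeholders, not taken from the example's repository):

```json
{
  "name": "example-dataset",
  "title": "Example Dataset",
  "resources": [
    {
      "name": "data",
      "path": "data.csv",
      "format": "csv"
    }
  ]
}
```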

## How to use

```bash
npx create-next-app -e https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless
# choose a name for your portal when prompted e.g. your-portal or go with default my-app

# then run it
cd your-portal
yarn # install packages
yarn dev # start app in dev mode
```

You should see the demo portal running with the example dataset provided:

<img src="/assets/examples/frictionless-dataset-demo.gif" />

### Use your own dataset

You can try it out with other [Frictionless datasets](https://datahub.io/search).

In the directory of your portal do:

```bash
export PORTAL_DATASET_PATH=/path/to/my/dataset
```

Then restart the dev server:

```
yarn dev
```

Check the portal page and it should have updated, e.g. like:

![](https://user-images.githubusercontent.com/32305516/233434652-3bbdae20-a15a-44aa-8ed9-31b6a4d2a6e5.png)
||||
@@ -33,7 +33,7 @@ Run the following commands:

```bash
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/github-backed-catalog
cd <app-name>
```

@@ -61,7 +61,7 @@ Congratulations, your new app is now running at http://localhost:3000.

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fgithub-backed-catalog)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fgithub-backed-catalog)

By clicking on this button, you will be redirected to a page which will allow you to clone the example into your own GitHub/GitLab/BitBucket account and automatically deploy it.

@@ -119,5 +119,5 @@ npm run start

## Links

- [Repo](https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog)
- [Repo](https://github.com/datopian/portaljs/tree/main/examples/github-backed-catalog)
- [Live Demo](https://example.portaljs.org)
@@ -3,9 +3,9 @@ title: Getting Started
description: 'Getting started guide and tutorial about data portal-building with PortalJS!'
---

Welcome to the DataHub PortalJS documentation!
Welcome to the PortalJS documentation!

If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/datahub/discussions) or on [our chat channel on Discord](https://discord.com/invite/KrRzMKU).
If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portaljs/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).

## Setup

@@ -16,10 +16,10 @@ If you have questions about anything related to PortalJS, you're always welcome

### Create a PortalJS app

To create a DataHub PortalJS app, open your terminal, cd into the directory you’d like to create the app in, and run the following command:
To create a PortalJS app, open your terminal, cd into the directory you’d like to create the app in, and run the following command:

```bash
npx create-next-app my-data-portal --example https://github.com/datopian/datahub/tree/main/examples/learn
npx create-next-app my-data-portal --example https://github.com/datopian/portaljs/tree/main/examples/learn
```

> [!tip]

@@ -29,5 +29,5 @@ It would be too complicated (and long) to explain all of the formatting aspects

## Other useful pages

[How to quickly add a simple Markdown-based page](https://www.portaljs.com/opensource/howtos/markdown)
[How to quickly edit text content on a single Markdown-based page](https://www.portaljs.com/opensource/howtos/markdown)
[How to quickly add a simple Markdown-based page](https://guide.portaljs.org/guides/add-a-simple-md-page)
[How to quickly edit text content on a single Markdown-based page](https://guide.portaljs.org/guides/edit-text-on-a-single-md-page)

@@ -11,5 +11,5 @@ description: Learn more about how you can achieve different data portal features
- [[howtos/drd|How to create data-rich documents with charts and tables?]]
- [[howtos/comments|How to add user comments?]]

If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/datahub/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).
If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portaljs/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).
||||
@@ -1,6 +1,6 @@
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: process.env.SITE_URL || 'https://portaljs.com',
  siteUrl: process.env.SITE_URL || 'https://portaljs.org',
  generateRobotsTxt: true,
  robotsTxtOptions: {
    policies: [
@@ -50,7 +50,7 @@ function MyApp({ Component, pageProps }) {
      <DefaultSeo
        defaultTitle={siteConfig.title}
        description={siteConfig.description}
        titleTemplate="DataHub PortalJS - %s"
        titleTemplate="PortalJS - %s"
        {...siteConfig.nextSeo}
      />
@@ -26,8 +26,8 @@ export default function Home({ sidebarTree }) {
  return (
    <>
      <LogoJsonLd
        url="https://portaljs.com"
        logo="https://portaljs.com/icon.png"
        url="https://portaljs.org"
        logo="https://portaljs.org/icon.png"
      />
      <Layout
        isHomePage={true}
@@ -35,7 +35,7 @@ export default function Home({ sidebarTree }) {
        sidebarTree={sidebarTree}
      >
        <Features />

        <Showcases />
        <Community />
      </Layout>
    </>
Some files were not shown because too many files have changed in this diff.