# Compare commits

Comparing `@portaljs/` ... `fix/dotorg` (77 commits)
| Author | SHA1 | Date |
|---|---|---|
| | a9025e5cbe | |
| | ad5a176e85 | |
| | eeb480e8cf | |
| | 30fcb256b2 | |
| | a4f8c0ed76 | |
| | 829f3b1f13 | |
| | 836b143a31 | |
| | be38086794 | |
| | 63d9e3b754 | |
| | f86f0541eb | |
| | 64bc212384 | |
| | 1e7daf353d | |
| | cc69dabf80 | |
| | a5d87712e0 | |
| | 86834fd1a6 | |
| | 8a661b1617 | |
| | 1baebc3f3c | |
| | bbac4954f5 | |
| | be6b184884 | |
| | 64103d6488 | |
| | 8e3496782c | |
| | e034503399 | |
| | 93ae498ec2 | |
| | 97e43fdcba | |
| | 32f29024f8 | |
| | 134f72948c | |
| | c1f2c526a8 | |
| | 8feb87739d | |
| | 3a07267e44 | |
| | 3f19ca16ed | |
| | 5deabac5fe | |
| | 96901150c6 | |
| | 9ff25ed7c4 | |
| | 8f884fceab | |
| | 7094eded50 | |
| | 30e7c6379f | |
| | feada58932 | |
| | 31406d48e3 | |
| | d6bf344ca3 | |
| | d1a5138c6e | |
| | a6047a9341 | |
| | a4e60540ae | |
| | e4c456c237 | |
| | ce9ebbf41e | |
| | a8fb176bcc | |
| | 2ac82367c5 | |
| | 85de6f7878 | |
| | 539fffeb55 | |
| | 0d276535bd | |
| | 38dd7103a3 | |
| | 48cd812a48 | |
| | 7bba10714d | |
| | de2c1e5b48 | |
| | 57952e0817 | |
| | df9664624f | |
| | 2ea185b710 | |
| | b859d48f17 | |
| | 3d73ac422e | |
| | 059ffe4e34 | |
| | 0aed7dce77 | |
| | c202d6cfc4 | |
| | d9c20528c5 | |
| | b7ee5a1869 | |
| | 4b5d549190 | |
| | e6f0ab4ec8 | |
| | 22038fbd4f | |
| | 8b292a9bf2 | |
| | cda3d335f1 | |
| | fe97cc87f4 | |
| | 88f6199d18 | |
| | 852cf60abc | |
| | 704be0d5a7 | |
| | fb3598fa49 | |
| | d898b5a833 | |
| | 1a8e7ac06e | |
| | 4355efe0c4 | |
| | 9e73410b17 | |
## `.vscode/extensions.json` (8 changed lines, vendored)

```
@@ -1,8 +0,0 @@
{
"recommendations": [
"nrwl.angular-console",
"esbenp.prettier-vscode",
"firsttris.vscode-jest-runner",
"dbaeumer.vscode-eslint"
]
}
```
```
@@ -4,7 +4,7 @@ title: Developer docs for contributors

## Our repository

https://github.com/datopian/portaljs
https://github.com/datopian/datahub

Structure:

@@ -17,7 +17,7 @@ Structure:

## How to contribute

You can start by checking our [issues board](https://github.com/datopian/portaljs/issues).
You can start by checking our [issues board](https://github.com/datopian/datahub/issues).

If you'd like to work on one of the issues you can:

@@ -35,7 +35,7 @@ If you'd like to work on one of the issues you can:

If you have an idea for improvement, and it doesn't have a corresponding issue yet, simply submit a new one.

> [!note]
> Join our [Discord channel](https://discord.gg/rTxfCutu) do discuss existing issues and to ask for help.
> Join our [Discord channel](https://discord.gg/KZSf3FG4EZ) do discuss existing issues and to ask for help.

## Nx
```
## `README.md` (77 changed lines)

```
@@ -1,31 +1,56 @@
<h1 align="center">
🌀 Portal.JS
<br />
Rapidly build rich data portals using a modern frontend framework
<a href="https://datahub.io/">
<img alt="datahub" src="http://datahub.io/datahub-cube.svg" width="146">
</a>
</h1>

* [What is Portal.JS ?](#What-is-Portal.JS)
* [Features](#Features)
* [For developers](#For-developers)
* [Docs](#Docs)
* [Community](#Community)
* [Appendix](#Appendix)
* [What happened to Recline?](#What-happened-to-Recline?)
<p align="center">
Bugs, issues and suggestions re DataHub Cloud ☁️ and DataHub OpenSource 🌀
<br />
<br /><a href="https://discord.gg/xfFDMPU9dC"><img src="https://dcbadge.vercel.app/api/server/xfFDMPU9dC" /></a>
</p>

# What is Portal.JS
## DataHub

🌀 Portal.JS is a framework for rapidly building rich data portal frontends using a modern frontend approach. Portal.JS can be used to present a single dataset or build a full-scale data catalog/portal.
This repo and issue tracker are for

Built in JavaScript and React on top of the popular [Next.js](https://nextjs.com/) framework. Portal.JS assumes a "decoupled" approach where the frontend is a separate service from the backend and interacts with backend(s) via an API. It can be used with any backend and has out of the box support for [CKAN](https://ckan.org/).
- DataHub Cloud ☁️ - https://datahub.io/
- DataHub 🌀 - https://datahub.io/opensource

## Features
### Issues

Found a bug: 👉 https://github.com/datopian/datahub/issues/new

### Discussions

Got a suggestion, a question, want some support or just want to shoot the breeze 🙂

Head to the discussion forum: 👉 https://github.com/datopian/datahub/discussions

### Chat on Discord

If you would prefer to get help via live chat check out our discord 👉

[Discord](https://discord.gg/xfFDMPU9dC)

### Docs

https://datahub.io/docs

## DataHub OpenSource 🌀

DataHub 🌀 is a platform for rapidly creating rich data portal and publishing systems using a modern frontend approach. Datahub can be used to publish a single dataset or build a full-scale data catalog/portal.

DataHub is built in JavaScript and React on top of the popular [Next.js](https://nextjs.org) framework. DataHub assumes a "decoupled" approach where the frontend is a separate service from the backend and interacts with backend(s) via an API. It can be used with any backend and has out of the box support for [CKAN](https://ckan.org/), GitHub, Frictionless Data Packages and more.

### Features

- 🗺️ Unified sites: present data and content in one seamless site, pulling datasets from a DMS (e.g. CKAN) and content from a CMS (e.g. Wordpress) with a common internal API.
- 👩💻 Developer friendly: built with familiar frontend tech (JavaScript, React, Next.js).
- 🔋 Batteries included: full set of portal components out of the box e.g. catalog search, dataset showcase, blog, etc.
- 🎨 Easy to theme and customize: installable themes, use standard CSS and React+CSS tooling. Add new routes quickly.
- 🧱 Extensible: quickly extend and develop/import your own React components
- 📝 Well documented: full set of documentation plus the documentation of Next.js and Apollo.
- 📝 Well documented: full set of documentation plus the documentation of Next.js.

### For developers

@@ -33,25 +58,3 @@ Built in JavaScript and React on top of the popular [Next.js](https://nextjs.com
- 🚀 Next.js framework: so everything in Next.js for free: Server Side Rendering, Static Site Generation, huge number of examples and integrations, etc.
- Server Side Rendering (SSR) => Unlimited number of pages, SEO and more whilst still using React.
- Static Site Generation (SSG) => Ultra-simple deployment, great performance, great lighthouse scores and more (good for small sites)

#### **Check out the [Portal.JS website](https://portaljs.org/) for a gallery of live portals**

___

# Docs

Access the Portal.JS documentation at:

https://portaljs.org/docs

- [Examples](https://portaljs.org/docs#examples)

# Community

If you have questions about anything related to Portal.JS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portal.js/discussions) or on our [Discord server](https://discord.gg/EeyfGrGu4U).

# Appendix

## What happened to Recline?

Portal.JS used to be Recline(JS). If you are looking for the old Recline codebase it still exists: see the [`recline` branch](https://github.com/datopian/portal.js/tree/recline). If you want context for the rename see [this issue](https://github.com/datopian/portal.js/issues/520).
```
````
@@ -1,7 +1,7 @@
This is a repo intended to serve as an example of a data catalog that get its data from a CKAN Instance.

```
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan-example
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan-ssg
cd <app-name>
```

@@ -19,7 +19,7 @@ npm run dev

Congratulations, you now have something similar to this running on `http://localhost:4200`

If yo go to any one of those pages by clicking on `More info` you will see something similar to this
If you go to any one of those pages by clicking on `More info` you will see something similar to this


## Deployment
````
```
@@ -1,6 +1,6 @@
This example creates a portal/showcase for a single dataset. The dataset should be a [Frictionless dataset (data package)][fd] i.e. there should be a `datapackage.json`.

[fd]: https://frictionlessdata.io/data-packages/
[fd]: https://specs.frictionlessdata.io/data-package/

## How to use
```
## `package-lock.json` (4 changed lines, generated)

```
@@ -49897,7 +49897,7 @@
},
"packages/components": {
"name": "@portaljs/components",
"version": "0.5.10",
"version": "1.2.0",
"dependencies": {
"@githubocto/flat-ui": "^0.14.1",
"@heroicons/react": "^2.0.17",
@@ -50383,7 +50383,7 @@
},
"packages/remark-wiki-link": {
"name": "@portaljs/remark-wiki-link",
"version": "1.1.2",
"version": "1.2.0",
"license": "MIT",
"dependencies": {
"mdast-util-to-markdown": "^1.5.0",
```
```
@@ -1,9 +1,16 @@
import 'tailwindcss/tailwind.css'
import '../src/index.css'


import type { Preview } from '@storybook/react';

window.process = {
...window.process,
env:{
...window.process?.env,


}
};

const preview: Preview = {
parameters: {
actions: { argTypesRegex: '^on[A-Z].*' },
```
```
@@ -1,5 +1,41 @@
# @portaljs/components

## 1.2.2

### Patch Changes

- [`eeb480e8`](https://github.com/datopian/datahub/commit/eeb480e8cff2d11072ace55ad683a65f54f5d07a) Thanks [@olayway](https://github.com/olayway)! - Adjust `xAxisTimeUnit` property in LineChart to allow for passing `yearmonth`.

## 1.2.1

### Patch Changes

- [`836b143a`](https://github.com/datopian/datahub/commit/836b143a3178b893b1aae3fb511d795dd3a63545) Thanks [@olayway](https://github.com/olayway)! - Fix: make tileLayerName in Map optional.

## 1.2.0

### Minor Changes

- [#1338](https://github.com/datopian/datahub/pull/1338) [`63d9e3b7`](https://github.com/datopian/datahub/commit/63d9e3b7543c38154e6989ef1cc1d694ae9fc4f8) Thanks [@olayway](https://github.com/olayway)! - Support for plotting multiple series in LineChart component.

## 1.1.0

### Minor Changes

- [#1122](https://github.com/datopian/datahub/pull/1122) [`8e349678`](https://github.com/datopian/datahub/commit/8e3496782c022b0653e07f217c6b315ba84e0e61) Thanks [@willy1989cv](https://github.com/willy1989cv)! - Map: allow users to choose a base layer setting

## 1.0.1

### Patch Changes

- [#1170](https://github.com/datopian/datahub/pull/1170) [`9ff25ed7`](https://github.com/datopian/datahub/commit/9ff25ed7c47c8c02cc078c64f76ae35d6754c508) Thanks [@lucasmbispo](https://github.com/lucasmbispo)! - iFrame component: change height

## 1.0.0

### Major Changes

- [#1103](https://github.com/datopian/datahub/pull/1103) [`48cd812a`](https://github.com/datopian/datahub/commit/48cd812a488a069a419d8ecc67f24f94d4d1d1d6) Thanks [@demenech](https://github.com/demenech)! - Components API tidying up and storybook docs improvements.

## 0.6.0

### Minor Changes
```
```
@@ -1,6 +1,6 @@
{
"name": "@portaljs/components",
"version": "0.6.0",
"version": "1.2.2",
"type": "module",
"description": "https://portaljs.org",
"keywords": [
```
```
@@ -7,7 +7,12 @@ export function Catalog({
datasets,
facets,
}: {
datasets: any[];
datasets: {
_id: string | number;
metadata: { title: string; [k: string]: string | number };
url_path: string;
[k: string]: any;
}[];
facets: string[];
}) {
const [indexFilter, setIndexFilter] = useState('');
@@ -56,7 +61,7 @@ export function Catalog({
//Then check if the selectedValue for the given facet is included in the dataset metadata
.filter((dataset) => {
//Avoids a server rendering breakage
if (!watch() || Object.keys(watch()).length === 0) return true
if (!watch() || Object.keys(watch()).length === 0) return true;
//This will filter only the key pairs of the metadata values that were selected as facets
const datasetFacets = Object.entries(dataset.metadata).filter((entry) =>
facets.includes(entry[0])
@@ -86,9 +91,7 @@ export function Catalog({
className="p-2 ml-1 text-sm shadow border border-block"
{...register(elem[0] + '.selectedValue')}
>
<option value="">
Filter by {elem[0]}
</option>
<option value="">Filter by {elem[0]}</option>
{(elem[1] as { possibleValues: string[] }).possibleValues.map(
(val) => (
<option
@@ -102,10 +105,10 @@ export function Catalog({
)}
</select>
))}
<ul className='mb-5 pl-6 mt-5 list-disc'>
<ul className="mb-5 pl-6 mt-5 list-disc">
{filteredDatasets.map((dataset) => (
<li className='py-2' key={dataset._id}>
<a className='font-medium underline' href={dataset.url_path}>
<li className="py-2" key={dataset._id}>
<a className="font-medium underline" href={dataset.url_path}>
{dataset.metadata.title
? dataset.metadata.title
: dataset.url_path}
@@ -116,4 +119,3 @@ export function Catalog({
</>
);
}
```
```
@@ -4,12 +4,14 @@ import { read, utils } from 'xlsx';
import { AgGridReact } from 'ag-grid-react';
import 'ag-grid-community/styles/ag-grid.css';
import 'ag-grid-community/styles/ag-theme-alpine.css';
import { Data } from '../types/properties';

export type ExcelProps = {
url: string;
data: Required<Pick<Data, 'url'>>;
};

export function Excel({ url }: ExcelProps) {
export function Excel({ data }: ExcelProps) {
const url = data.url;
const [isLoading, setIsLoading] = useState<boolean>(false);
const [activeSheetName, setActiveSheetName] = useState<string>();
const [workbook, setWorkbook] = useState<any>();
```
```
@@ -2,6 +2,7 @@ import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import Papa from 'papaparse';
import { Grid } from '@githubocto/flat-ui';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

@@ -36,30 +37,25 @@ export async function parseCsv(file: string, parsingConfig): Promise<any> {
}

export interface FlatUiTableProps {
url?: string;
data?: { [key: string]: number | string }[];
rawCsv?: string;
randomId?: number;
data: Data;
uniqueId?: number;
bytes: number;
parsingConfig: any;
}
export const FlatUiTable: React.FC<FlatUiTableProps> = ({
url,
data,
rawCsv,
uniqueId,
bytes = 5132288,
parsingConfig = {},
}) => {
const randomId = Math.random();
uniqueId = uniqueId ?? Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<TableInner
bytes={bytes}
url={url}
data={data}
rawCsv={rawCsv}
randomId={randomId}
uniqueId={uniqueId}
parsingConfig={parsingConfig}
/>
</QueryClientProvider>
@@ -67,33 +63,32 @@ export const FlatUiTable: React.FC<FlatUiTableProps> = ({
};

const TableInner: React.FC<FlatUiTableProps> = ({
url,
data,
rawCsv,
randomId,
uniqueId,
bytes,
parsingConfig,
}) => {
if (data) {
const url = data.url;
const csv = data.csv;
const values = data.values;

if (values) {
return (
<div className="w-full" style={{ height: '500px' }}>
<Grid data={data} />
<Grid data={values} />
</div>
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
['dataCsv', url, randomId],
['dataCsv', url, uniqueId],
() => getCsv(url as string, bytes),
{ enabled: !!url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
['dataPreview', csvString, randomId],
['dataPreview', csvString, uniqueId],
() =>
parseCsv(
rawCsv ? (rawCsv as string) : (csvString as string),
parsingConfig
),
{ enabled: rawCsv ? true : !!csvString }
parseCsv(csv ? (csv as string) : (csvString as string), parsingConfig),
{ enabled: csv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">
```
```
@@ -1,14 +1,17 @@
import { CSSProperties } from "react";
import { CSSProperties } from 'react';
import { Data } from '../types/properties';

export interface IframeProps {
url: string;
data: Required<Pick<Data, 'url'>>;
style?: CSSProperties;
}

export function Iframe({
url, style
}: IframeProps) {
export function Iframe({ data, style }: IframeProps) {
const url = data.url;
return (
<iframe src={url} style={style ?? { width: `100%`, height: `100%` }}></iframe>
<iframe
src={url}
style={style ?? { width: `100%`, height: `600px` }}
></iframe>
);
}
```
```
@@ -2,35 +2,40 @@ import { useEffect, useState } from 'react';
import LoadingSpinner from './LoadingSpinner';
import { VegaLite } from './VegaLite';
import loadData from '../lib/loadData';
import { Data } from '../types/properties';

type AxisType = 'quantitative' | 'temporal';
type TimeUnit = 'year' | undefined; // or ...
type TimeUnit = 'year' | 'yearmonth' | undefined; // or ...

export type LineChartProps = {
data: Array<Array<string | number>> | string | { x: string; y: number }[];
data: Omit<Data, 'csv'>;
title?: string;
xAxis?: string;
xAxis: string;
xAxisType?: AxisType;
xAxisTimeUnit: TimeUnit;
yAxis?: string;
xAxisTimeUnit?: TimeUnit;
yAxis: string | string[];
yAxisType?: AxisType;
fullWidth?: boolean;
symbol?: string;
};

export function LineChart({
data = [],
fullWidth = false,
data,
title = '',
xAxis = 'x',
xAxis,
xAxisType = 'temporal',
xAxisTimeUnit = 'year', // TODO: defaults to undefined would probably work better... keeping it as it's for compatibility purposes
yAxis = 'y',
yAxis,
yAxisType = 'quantitative',
symbol,
}: LineChartProps) {
const url = data.url;
const values = data.values;
const [isLoading, setIsLoading] = useState<boolean>(false);

// By default, assumes data is an Array...
const [specData, setSpecData] = useState<any>({ name: 'table' });
const isMultiYAxis = Array.isArray(yAxis);

const spec = {
$schema: 'https://vega.github.io/schema/vega-lite/v5.json',
@@ -44,6 +49,11 @@ export function LineChart({
tooltip: true,
},
data: specData,
...(isMultiYAxis
? {
transform: [{ fold: yAxis, as: ['key', 'value'] }],
}
: {}),
selection: {
grid: {
type: 'interval',
@@ -57,20 +67,35 @@ export function LineChart({
type: xAxisType,
},
y: {
field: yAxis,
field: isMultiYAxis ? 'value' : yAxis,
type: yAxisType,
},
...(symbol
? {
color: {
field: symbol,
type: 'nominal',
},
}
: {}),
...(isMultiYAxis
? {
color: {
field: 'key',
type: 'nominal',
},
}
: {}),
},
} as any;

useEffect(() => {
// If data is string, assume it's a URL
if (typeof data === 'string') {
if (url) {
setIsLoading(true);

// Manualy loading the data allows us to do other kinds
// of stuff later e.g. load a file partially
loadData(data).then((res: any) => {
loadData(url).then((res: any) => {
setSpecData({ values: res, format: { type: 'csv' } });
setIsLoading(false);
});
@@ -78,12 +103,8 @@ export function LineChart({
}, []);

var vegaData = {};
if (Array.isArray(data)) {
var dataObj;
dataObj = data.map((r) => {
return { x: r[0], y: r[1] };
});
vegaData = { table: dataObj };
if (values) {
vegaData = { table: values };
}

return isLoading ? (
@@ -91,6 +112,6 @@ export function LineChart({
<LoadingSpinner />
</div>
) : (
<VegaLite fullWidth={fullWidth} data={vegaData} spec={spec} />
<VegaLite data={vegaData} spec={spec} />
);
}
```
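For reference, a hedged sketch of how the reworked LineChart props might be used after this change; the CSV URL and column names below are placeholders for illustration and do not come from this changeset.

```tsx
import { LineChart } from '@portaljs/components';

// Hypothetical example: `data` is now an object with either `url` or `values`,
// and `yAxis` may be an array of column names to plot multiple series.
export function TemperatureChart() {
  return (
    <LineChart
      data={{ url: 'https://example.org/temperatures.csv' }} // placeholder URL
      xAxis="year"
      xAxisTimeUnit="year"
      yAxis={['min', 'max']} // each listed column becomes its own series
      title="Yearly temperatures"
    />
  );
}
```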
```
@@ -2,6 +2,7 @@ import { CSSProperties, useEffect, useState } from 'react';
import LoadingSpinner from './LoadingSpinner';
import loadData from '../lib/loadData';
import chroma from 'chroma-js';
import { GeospatialData } from '../types/properties';
import {
MapContainer,
TileLayer,
@@ -11,10 +12,34 @@ import {

import 'leaflet/dist/leaflet.css';
import * as L from 'leaflet';
import providers from '../lib/tileLayerPresets';

type VariantKeys<T> = T extends { variants: infer V }
? {
[K in keyof V]: K extends string
? `${K}` | `${K}.${VariantKeys<V[K]>}`
: never;
}[keyof V]
: never;

type ProviderVariantKeys<T> = {
[K in keyof T]: K extends string
? `${K}` | `${K}.${VariantKeys<T[K]>}`
: never;
}[keyof T];

type TileLayerPreset = ProviderVariantKeys<typeof providers> | 'custom';

interface TileLayerSettings extends L.TileLayerOptions {
url?: string;
variant?: string | any;
}

export type MapProps = {
tileLayerName?: TileLayerPreset;
tileLayerOptions?: TileLayerSettings | undefined;
layers: {
data: string | GeoJSON.GeoJSON;
data: GeospatialData;
name: string;
colorScale?: {
starting: string;
@@ -25,18 +50,29 @@ export type MapProps = {
propNames: string[];
}
| boolean;
_id?: number;
}[];
title?: string;
center?: { latitude: number | undefined; longitude: number | undefined };
zoom?: number;
style?: CSSProperties;
autoZoomConfiguration?: {
layerName: string
}
layerName: string;
};
};

const tileLayerDefaultName = process?.env
.NEXT_PUBLIC_MAP_TILE_LAYER_NAME as TileLayerPreset;

const tileLayerDefaultOptions = Object.keys(process?.env)
.filter((key) => key.startsWith('NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_'))
.reduce((obj, key) => {
obj[key.split('NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_')[1]] = process.env[key];
return obj;
}, {}) as TileLayerSettings;

export function Map({
tileLayerName = tileLayerDefaultName || 'OpenStreetMap',
tileLayerOptions,
layers = [
{
data: null,
@@ -54,19 +90,110 @@ export function Map({
const [isLoading, setIsLoading] = useState<boolean>(false);
const [layersData, setLayersData] = useState<any>([]);

/*
tileLayerDefaultOptions
extract all environment variables thats starts with NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_.
the variables names are the same as the TileLayer object properties:
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_url:
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_attribution
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_accessToken
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_id
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_ext
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_bounds
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_maxZoom
- NEXT_PUBLIC_MAP_TILE_LAYER_OPTION_minZoom
see TileLayerOptions inteface
*/

//tileLayerData prioritizes properties passed through component over those passed through .env variables
tileLayerOptions = Object.assign(tileLayerDefaultOptions, tileLayerOptions);

let provider = {
url: tileLayerOptions.url,
options: tileLayerOptions,
};

if (tileLayerName != 'custom') {
var parts = tileLayerName.split('.');
var providerName = parts[0];
var variantName: string = parts[1];

//make sure to declare a variant if url depends on a variant: assume first
if (providers[providerName].url?.includes('{variant}') && !variantName)
variantName = Object.keys(providers[providerName].variants)[0];

if (!providers[providerName]) {
throw 'No such provider (' + providerName + ')';
}

provider = {
url: providers[providerName].url,
options: providers[providerName].options,
};

// overwrite values in provider from variant.
if (variantName && 'variants' in providers[providerName]) {
if (!(variantName in providers[providerName].variants)) {
throw 'No such variant of ' + providerName + ' (' + variantName + ')';
}
var variant = providers[providerName].variants[variantName];
var variantOptions;
if (typeof variant === 'string') {
variantOptions = {
variant: variant,
};
} else {
variantOptions = variant.options;
}
provider = {
url: variant.url || provider.url,
options: L.Util.extend({}, provider.options, variantOptions),
};
}

var attributionReplacer = function (attr) {
if (attr.indexOf('{attribution.') === -1) {
return attr;
}
return attr.replace(
/\{attribution.(\w*)\}/g,
function (match: any, attributionName: string) {
match;
return attributionReplacer(
providers[attributionName].options.attribution
);
}
);
};

provider.options.attribution = attributionReplacer(
provider.options.attribution
);
}

var tileLayerData = L.Util.extend(
{
url: provider.url,
},
provider.options,
tileLayerOptions
);

useEffect(() => {
const loadDataPromises = layers.map(async (layer) => {
const url = layer.data.url;
const geojson = layer.data.geojson;
let layerData: any;

if (typeof layer.data === 'string') {
if (url) {
// If "data" is string, assume it's a URL
setIsLoading(true);
layerData = await loadData(layer.data).then((res: any) => {
layerData = await loadData(url).then((res: any) => {
return JSON.parse(res);
});
} else {
// Else, expect raw GeoJSON
layerData = layer.data;
layerData = geojson;
}

if (layer.colorScale) {
@@ -98,6 +225,7 @@ export function Map({
</div>
) : (
<MapContainer
key={layersData}
center={[center.latitude, center.longitude]}
zoom={zoom}
scrollWheelZoom={false}
@@ -111,23 +239,23 @@ export function Map({
// Create the title box
var info = new L.Control() as any;

info.onAdd = function() {
info.onAdd = function () {
this._div = L.DomUtil.create('div', 'info');
this.update();
return this._div;
};

info.update = function() {
info.update = function () {
this._div.innerHTML = `<h4 style="font-weight: 600; background: #f9f9f9; padding: 5px; border-radius: 5px; color: #464646;">${title}</h4>`;
};

if (title) info.addTo(map.target);
if(!autoZoomConfiguration) return;
if (!autoZoomConfiguration) return;

let layerToZoomBounds = L.latLngBounds(L.latLng(0, 0), L.latLng(0, 0));

layers.forEach((layer) => {
if(layer.name === autoZoomConfiguration.layerName) {
if (layer.name === autoZoomConfiguration.layerName) {
const data = layersData.find(
(layerData) => layerData.name === layer.name
)?.data;
@@ -142,10 +270,8 @@ export function Map({
map.target.fitBounds(layerToZoomBounds);
}}
>
<TileLayer
attribution='© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
/>
{tileLayerData.url && <TileLayer {...tileLayerData} />}

<LayersControl position="bottomright">
{layers.map((layer) => {
const data = layersData.find(
```
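As an illustration only, a sketch of how the updated Map props might be wired up; the layer name and GeoJSON URL are placeholders, and the tile layer preset shown is simply the default named in the code above.

```tsx
import { Map } from '@portaljs/components';

// Hypothetical example: each layer's `data` is now a GeospatialData object
// ({ url } or { geojson }) instead of a raw string or GeoJSON value.
export function CitiesMap() {
  return (
    <Map
      tileLayerName="OpenStreetMap" // default preset; can also come from NEXT_PUBLIC_MAP_TILE_LAYER_NAME
      layers={[
        { name: 'cities', data: { url: 'https://example.org/cities.geojson' } }, // placeholder URL
      ]}
      autoZoomConfiguration={{ layerName: 'cities' }}
    />
  );
}
```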
```
@@ -1,22 +1,24 @@
// Core viewer
import { Viewer, Worker, SpecialZoomLevel } from '@react-pdf-viewer/core';
import { defaultLayoutPlugin } from '@react-pdf-viewer/default-layout';
import { Data } from '../types/properties';

// Import styles
import '@react-pdf-viewer/core/lib/styles/index.css';
import '@react-pdf-viewer/default-layout/lib/styles/index.css';

export interface PdfViewerProps {
url: string;
data: Required<Pick<Data, 'url'>>;
layout: boolean;
parentClassName?: string;
}

export function PdfViewer({
url,
data,
layout = false,
parentClassName,
parentClassName = 'h-screen',
}: PdfViewerProps) {
const url = data.url;
const defaultLayoutPluginInstance = defaultLayoutPlugin();
return (
<Worker workerUrl="https://unpkg.com/pdfjs-dist@2.15.349/build/pdf.worker.js">
```
```
@@ -1,7 +1,8 @@
import { QueryClient, QueryClientProvider, useQuery } from "react-query";
import { Plotly } from "./Plotly";
import Papa, { ParseConfig } from "papaparse";
import LoadingSpinner from "./LoadingSpinner";
import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import { Plotly } from './Plotly';
import Papa, { ParseConfig } from 'papaparse';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

@@ -17,7 +18,7 @@ async function getCsv(url: string, bytes: number) {

async function parseCsv(
file: string,
parsingConfig: ParseConfig,
parsingConfig: ParseConfig
): Promise<any> {
return new Promise((resolve, reject) => {
Papa.parse(file, {
@@ -39,43 +40,40 @@ async function parseCsv(
}

export interface PlotlyBarChartProps {
url?: string;
data?: { [key: string]: number | string }[];
rawCsv?: string;
randomId?: number;
data: Data;
uniqueId?: number;
bytes?: number;
parsingConfig?: ParseConfig;
xAxis: string;
yAxis: string;
lineLabel?: string;
// TODO: commented out because this doesn't work. I believe
// this would only make any difference on charts with multiple
// traces.
// lineLabel?: string;
title?: string;
}

export const PlotlyBarChart: React.FC<PlotlyBarChartProps> = ({
url,
data,
rawCsv,
bytes = 5132288,
parsingConfig = {},
xAxis,
yAxis,
lineLabel,
title = "",
// lineLabel,
title = '',
}) => {
const randomId = Math.random();
const uniqueId = Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<PlotlyBarChartInner
url={url}
data={data}
rawCsv={rawCsv}
randomId={randomId}
uniqueId={uniqueId}
bytes={bytes}
parsingConfig={parsingConfig}
xAxis={xAxis}
yAxis={yAxis}
lineLabel={lineLabel ?? yAxis}
// lineLabel={lineLabel ?? yAxis}
title={title}
/>
</QueryClientProvider>
@@ -83,30 +81,28 @@ export const PlotlyBarChart: React.FC<PlotlyBarChartProps> = ({
};

const PlotlyBarChartInner: React.FC<PlotlyBarChartProps> = ({
url,
data,
rawCsv,
randomId,
uniqueId,
bytes,
parsingConfig,
xAxis,
yAxis,
lineLabel,
// lineLabel,
title,
}) => {
if (data) {
if (data.values) {
return (
<div className="w-full" style={{ height: "500px" }}>
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: data.map((d) => d[xAxis]),
y: data.map((d) => d[yAxis]),
type: "bar",
name: lineLabel,
x: data.values.map((d) => d[xAxis]),
y: data.values.map((d) => d[yAxis]),
type: 'bar',
// name: lineLabel,
},
]}
/>
@@ -114,18 +110,18 @@ const PlotlyBarChartInner: React.FC<PlotlyBarChartProps> = ({
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
["dataCsv", url, randomId],
() => getCsv(url as string, bytes ?? 5132288),
{ enabled: !!url },
['dataCsv', data.url, uniqueId],
() => getCsv(data.url as string, bytes ?? 5132288),
{ enabled: !!data.url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
["dataPreview", csvString, randomId],
['dataPreview', csvString, uniqueId],
() =>
parseCsv(
rawCsv ? (rawCsv as string) : (csvString as string),
parsingConfig ?? {},
data.csv ? (data.csv as string) : (csvString as string),
parsingConfig ?? {}
),
{ enabled: rawCsv ? true : !!csvString },
{ enabled: data.csv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">
@@ -133,7 +129,7 @@ const PlotlyBarChartInner: React.FC<PlotlyBarChartProps> = ({
</div>;
if (parsedData)
return (
<div className="w-full" style={{ height: "500px" }}>
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
@@ -142,8 +138,8 @@ const PlotlyBarChartInner: React.FC<PlotlyBarChartProps> = ({
{
x: parsedData.data.map((d: any) => d[xAxis]),
y: parsedData.data.map((d: any) => d[yAxis]),
type: "bar",
name: lineLabel,
type: 'bar',
// name: lineLabel, TODO: commented out because this doesn't work
},
]}
/>
```
```
@@ -1,7 +1,8 @@
import { QueryClient, QueryClientProvider, useQuery } from "react-query";
import { Plotly } from "./Plotly";
import Papa, { ParseConfig } from "papaparse";
import LoadingSpinner from "./LoadingSpinner";
import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import { Plotly } from './Plotly';
import Papa, { ParseConfig } from 'papaparse';
import LoadingSpinner from './LoadingSpinner';
import { Data } from '../types/properties';

const queryClient = new QueryClient();

@@ -17,7 +18,7 @@ async function getCsv(url: string, bytes: number) {

async function parseCsv(
file: string,
parsingConfig: ParseConfig,
parsingConfig: ParseConfig
): Promise<any> {
return new Promise((resolve, reject) => {
Papa.parse(file, {
@@ -39,38 +40,33 @@ async function parseCsv(
}

export interface PlotlyLineChartProps {
url?: string;
data?: { [key: string]: number | string }[];
rawCsv?: string;
randomId?: number;
data: Data;
bytes?: number;
parsingConfig?: ParseConfig;
xAxis: string;
yAxis: string;
lineLabel?: string;
title?: string;
uniqueId?: number;
}

export const PlotlyLineChart: React.FC<PlotlyLineChartProps> = ({
url,
data,
rawCsv,
bytes = 5132288,
parsingConfig = {},
xAxis,
yAxis,
lineLabel,
title = "",
title = '',
uniqueId,
}) => {
const randomId = Math.random();
uniqueId = uniqueId ?? Math.random();
return (
// Provide the client to your App
<QueryClientProvider client={queryClient}>
<LineChartInner
url={url}
data={data}
rawCsv={rawCsv}
randomId={randomId}
uniqueId={uniqueId}
bytes={bytes}
parsingConfig={parsingConfig}
xAxis={xAxis}
@@ -83,10 +79,8 @@ export const PlotlyLineChart: React.FC<PlotlyLineChartProps> = ({
};

const LineChartInner: React.FC<PlotlyLineChartProps> = ({
url,
data,
rawCsv,
randomId,
uniqueId,
bytes,
parsingConfig,
xAxis,
@@ -94,18 +88,22 @@ const LineChartInner: React.FC<PlotlyLineChartProps> = ({
lineLabel,
title,
}) => {
if (data) {
const values = data.values;
const url = data.url;
const csv = data.csv;

if (values) {
return (
<div className="w-full" style={{ height: "500px" }}>
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
}}
data={[
{
x: data.map((d) => d[xAxis]),
y: data.map((d) => d[yAxis]),
mode: "lines",
x: values.map((d) => d[xAxis]),
y: values.map((d) => d[yAxis]),
mode: 'lines',
name: lineLabel,
},
]}
@@ -114,18 +112,18 @@ const LineChartInner: React.FC<PlotlyLineChartProps> = ({
);
}
const { data: csvString, isLoading: isDownloadingCSV } = useQuery(
["dataCsv", url, randomId],
['dataCsv', url, uniqueId],
() => getCsv(url as string, bytes ?? 5132288),
{ enabled: !!url },
{ enabled: !!url }
);
const { data: parsedData, isLoading: isParsing } = useQuery(
["dataPreview", csvString, randomId],
['dataPreview', csvString, uniqueId],
() =>
parseCsv(
rawCsv ? (rawCsv as string) : (csvString as string),
parsingConfig ?? {},
csv ? (csv as string) : (csvString as string),
parsingConfig ?? {}
),
{ enabled: rawCsv ? true : !!csvString },
{ enabled: csv ? true : !!csvString }
);
if (isParsing || isDownloadingCSV)
<div className="w-full flex justify-center items-center h-[500px]">
@@ -133,7 +131,7 @@ const LineChartInner: React.FC<PlotlyLineChartProps> = ({
</div>;
if (parsedData)
return (
<div className="w-full" style={{ height: "500px" }}>
<div className="w-full" style={{ height: '500px' }}>
<Plotly
layout={{
title,
@@ -142,7 +140,7 @@ const LineChartInner: React.FC<PlotlyLineChartProps> = ({
{
x: parsedData.data.map((d: any) => d[xAxis]),
y: parsedData.data.map((d: any) => d[yAxis]),
mode: "lines",
mode: 'lines',
name: lineLabel,
},
]}
```
```
@@ -1,6 +1,7 @@
// Wrapper for the Vega component
import { Vega as VegaOg } from "react-vega";
import { VegaProps } from "react-vega/lib/Vega";

export function Vega(props) {
export function Vega(props: VegaProps) {
return <VegaOg {...props} />;
}
```
```
@@ -1,8 +1,9 @@
// Wrapper for the Vega Lite component
import { VegaLite as VegaLiteOg } from "react-vega";
import applyFullWidthDirective from "../lib/applyFullWidthDirective";
import { VegaLite as VegaLiteOg } from 'react-vega';
import { VegaLiteProps } from 'react-vega/lib/VegaLite';
import applyFullWidthDirective from '../lib/applyFullWidthDirective';

export function VegaLite(props) {
export function VegaLite(props: VegaLiteProps) {
const Component = applyFullWidthDirective({ Component: VegaLiteOg });

return <Component {...props} />;
```
```
@@ -1,15 +1,17 @@
export * from './components/Table';
export * from './components/Catalog';
export * from './components/LineChart';
export * from './components/Vega';
export * from './components/VegaLite';
export * from './components/FlatUiTable';
export * from './components/OpenLayers/OpenLayers';
export * from './components/Map';
export * from './components/PdfViewer';
export * from "./components/Excel";
export * from "./components/BucketViewer";
export * from "./components/Iframe";
export * from "./components/Plotly";
export * from "./components/PlotlyLineChart";
export * from "./components/PlotlyBarChart";
// NOTE: components that are hidden for now
// TODO: deprecate those components?
// export * from './components/Table';
// export * from "./components/BucketViewer";
// export * from './components/OpenLayers/OpenLayers';
```
## `packages/components/src/lib/tileLayerPresets.tsx` (1211 lines, normal file)

File diff suppressed because it is too large.
## `packages/components/src/types/properties.ts` (18 lines, normal file)

```
@@ -0,0 +1,18 @@
/*
* All components should use this interface for
* its data property.
* Based on vega.
*
*/

type URL = string; // Just in case we want to transform it into an object with configurations
export interface Data {
url?: URL;
values?: { [key: string]: number | string }[];
csv?: string;
}

export interface GeospatialData {
url?: URL;
geojson?: GeoJSON.GeoJSON;
}
```
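To make the new interface concrete, a hedged sketch of the `Data` and `GeospatialData` variants as they might be written inside the package (the relative import path mirrors the component files above; all values and URLs are placeholders):

```ts
import { Data, GeospatialData } from '../types/properties';

// Hypothetical examples of the three Data variants accepted by the components.
const fromUrl: Data = { url: 'https://example.org/table.csv' }; // remote CSV
const fromValues: Data = {
  values: [
    { year: 1850, anomaly: -0.418 },
    { year: 2020, anomaly: 0.923 },
  ], // inline rows
};
const fromCsv: Data = { csv: 'year,anomaly\n1850,-0.418\n2020,0.923' }; // raw CSV string

// Geospatial layers take either a URL or an inline GeoJSON object.
const layerFromUrl: GeospatialData = { url: 'https://example.org/cities.geojson' };
const layerInline: GeospatialData = { geojson: { type: 'FeatureCollection', features: [] } };
```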
```
@@ -1,74 +0,0 @@
import type { Meta, StoryObj } from '@storybook/react';

import { PlotlyBarChart, PlotlyBarChartProps } from '../src/components/PlotlyBarChart';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/PlotlyBarChart',
component: PlotlyBarChart,
tags: ['autodocs'],
argTypes: {
url: {
description:
'CSV Url to be parsed and used as data source',
},
data: {
description:
'Data to be displayed. as an array of key value pairs \n\n E.g.: [{ year: 1850, temperature: -0.41765878 }, { year: 1851, temperature: -0.2333498 }, ...]',
},
rawCsv: {
description:
'Raw csv data to be parsed and used as data source',
},
bytes: {
description:
'How many bytes to read from the url',
},
parsingConfig: {
description: 'If using url or rawCsv, this parsing config will be used to parse the data. Optional, check https://www.papaparse.com/ for more info',
},
title: {
description: 'Title to display on the chart. Optional.',
},
lineLabel: {
description: 'Label to display on the line, Optional, will use yAxis if not provided',
},
xAxis: {
description:
'Name of the X axis on the data. Required when the "data" parameter is an URL.',
},
yAxis: {
description:
'Name of the Y axis on the data. Required when the "data" parameter is an URL.',
},
},
};

export default meta;

type Story = StoryObj<PlotlyBarChartProps>;

export const FromDataPoints: Story = {
name: 'Line chart from array of data points',
args: {
data: [
{year: '1850', temperature: -0.41765878},
{year: '1851', temperature: -0.2333498},
{year: '1852', temperature: -0.22939907},
{year: '1853', temperature: -0.27035445},
{year: '1854', temperature: -0.29163003},
],
xAxis: 'year',
yAxis: 'temperature',
},
};

export const FromURL: Story = {
name: 'Line chart from URL',
args: {
title: 'Apple Stock Prices',
url: 'https://raw.githubusercontent.com/plotly/datasets/master/finance-charts-apple.csv',
xAxis: 'Date',
yAxis: 'AAPL.Open',
},
};
```
```
@@ -1,3 +1,6 @@
// NOTE: this component was renamed with .bkp so that it's hidden
// from the Storybook app

import { type Meta, type StoryObj } from '@storybook/react';

import {
```
```
@@ -10,11 +10,14 @@ const meta: Meta = {
argTypes: {
datasets: {
description:
'Lists of datasets to be displayed in the list, will usually be automatically available',
"Array of items to be displayed on the searchable list. Must have the following properties: \n\n \
`_id`: item's unique id \n\n \
`url_path`: href of the item \n\n \
`metadata`: object with a `title` property, that will be displayed as the title of the item, together with any other custom fields that might or not be faceted.",
},
facets: {
description:
'List of frontmatter fields that should be used as filters, needs to match exactly with the field name',
"Array of strings, which are name of properties in the datasets' `metadata`, which are going to be faceted.",
},
},
};
@@ -31,99 +34,35 @@ export const WithoutFacets: Story = {
{
_id: '07026b22d49916754df1dc8ffb9ccd1c31878aae',
url_path: 'dataset-4',
file_path: 'content/dataset-4/index.md',
metadata: {
title: 'Detecting Abusive Albanian',
'link-to-publication': 'https://arxiv.org/abs/2107.13592',
'link-to-data': 'https://doi.org/10.6084/m9.figshare.19333298.v1',
'task-description':
'Hierarchical (offensive/not; untargeted/targeted; person/group/other)',
'details-of-task':
'Detect and categorise abusive language in social media data',
'size-of-dataset': 11874,
'percentage-abusive': 13.2,
language: 'Albanian',
'level-of-annotation': ['Posts'],
platform: ['Instagram', 'Youtube'],
medium: ['Text'],
reference:
'Nurce, E., Keci, J., Derczynski, L., 2021. Detecting Abusive Albanian. arXiv:2107.13592',
},
},
{
_id: '42c86cf3c4fbbab11d91c2a7d6dcb8f750bc4e19',
url_path: 'dataset-1',
file_path: 'content/dataset-1/index.md',
metadata: {
title: 'AbuseEval v1.0',
'link-to-publication':
'http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.760.pdf',
'link-to-data': 'https://github.com/tommasoc80/AbuseEval',
'task-description':
'Explicitness annotation of offensive and abusive content',
'details-of-task':
'Enriched versions of the OffensEval/OLID dataset with the distinction of explicit/implicit offensive messages and the new dimension for abusive messages. Labels for offensive language: EXPLICIT, IMPLICT, NOT; Labels for abusive language: EXPLICIT, IMPLICT, NOTABU',
'size-of-dataset': 14100,
'percentage-abusive': 20.75,
language: 'English',
'level-of-annotation': ['Tweets'],
platform: ['Twitter'],
medium: ['Text'],
reference:
'Caselli, T., Basile, V., Jelena, M., Inga, K., and Michael, G. 2020. "I feel offended, don’t be abusive! implicit/explicit messages in offensive and abusive language". The 12th Language Resources and Evaluation Conference (pp. 6193-6202). European Language Resources Association.',
},
},
{
_id: '80001dd32a752421fdcc64e91fbd237dc31d6bb3',
url_path: 'dataset-2',
file_path: 'content/dataset-2/index.md',
metadata: {
title:
'Abusive Language Detection on Arabic Social Media (Al Jazeera)',
'link-to-publication': 'https://www.aclweb.org/anthology/W17-3008',
'link-to-data':
'http://alt.qcri.org/~hmubarak/offensive/AJCommentsClassification-CF.xlsx',
'task-description':
'Ternary (Obscene, Offensive but not obscene, Clean)',
'details-of-task': 'Incivility',
'size-of-dataset': 32000,
'percentage-abusive': 0.81,
language: 'Arabic',
'level-of-annotation': ['Posts'],
platform: ['AlJazeera'],
medium: ['Text'],
reference:
'Mubarak, H., Darwish, K. and Magdy, W., 2017. Abusive Language Detection on Arabic Social Media. In: Proceedings of the First Workshop on Abusive Language Online. Vancouver, Canada: Association for Computational Linguistics, pp.52-56.',
},
},
{
_id: '96649d05d8193f4333b10015af76c6562971bd8c',
url_path: 'dataset-3',
file_path: 'content/dataset-3/index.md',
metadata: {
title: 'CoRAL: a Context-aware Croatian Abusive Language Dataset',
'link-to-publication':
'https://aclanthology.org/2022.findings-aacl.21/',
'link-to-data':
'https://github.com/shekharRavi/CoRAL-dataset-Findings-of-the-ACL-AACL-IJCNLP-2022',
'task-description':
'Multi-class based on context dependency categories (CDC)',
'details-of-task': 'Detectioning CDC from abusive comments',
'size-of-dataset': 2240,
'percentage-abusive': 100,
language: 'Croatian',
'level-of-annotation': ['Posts'],
platform: ['Posts'],
medium: ['Newspaper Comments'],
reference:
'Ravi Shekhar, Mladen Karan and Matthew Purver (2022). CoRAL: a Context-aware Croatian Abusive Language Dataset. Findings of the ACL: AACL-IJCNLP.',
},
},
],
},
};
;

export const WithFacets: Story = {
name: 'Catalog with facets',
args: {
@@ -131,7 +70,6 @@ export const WithFacets: Story = {
{
_id: '07026b22d49916754df1dc8ffb9ccd1c31878aae',
url_path: 'dataset-4',
file_path: 'content/dataset-4/index.md',
metadata: {
title: 'Detecting Abusive Albanian',
'link-to-publication': 'https://arxiv.org/abs/2107.13592',
@@ -220,7 +158,6 @@ export const WithFacets: Story = {
},
},
],
facets: ['language', 'platform']
facets: ['language', 'platform'],
},
};
;
```
```
@@ -4,13 +4,13 @@ import { Excel, ExcelProps } from '../src/components/Excel';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Excel',
title: 'Components/Tabular/Excel',
component: Excel,
tags: ['autodocs'],
argTypes: {
url: {
data: {
description:
'Url of the file to be displayed e.g.: "https://url.to/data.csv"',
'Object with a `url` property pointing to the Excel file to be displayed, e.g.: `{ url: "https://url.to/data.csv" }`',
},
},
};
@@ -22,13 +22,17 @@ type Story = StoryObj<ExcelProps>;
export const SingleSheet: Story = {
name: 'Excel file with just one sheet',
args: {
data: {
url: 'https://sheetjs.com/pres.xlsx',
},
},
};

export const MultipleSheet: Story = {
name: 'Excel file with multiple sheets',
args: {
data: {
url: 'https://storage.portaljs.org/IC-Gantt-Chart-Project-Template-8857.xlsx',
},
},
};
```
```
@@ -4,29 +4,31 @@ import { FlatUiTable, FlatUiTableProps } from '../src/components/FlatUiTable';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/FlatUiTable',
title: 'Components/Tabular/FlatUiTable',
component: FlatUiTable,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Data to be displayed in the table, must be setup as an array of key value pairs',
},
csv: {
description: 'CSV data as string.',
},
url: {
description:
'Fetch the data from a CSV file remotely. only the first 5MB of data will be displayed',
'Data to be displayed. \n\n \
Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
`url`: URL pointing to a CSV file. \n\n \
`values`: array of objects. \n\n \
`csv`: string with valid CSV. \n\n \
',
},
bytes: {
description:
'Fetch the data from a CSV file remotely. only the first <bytes> of data will be displayed',
'Fetch the data from a CSV file remotely. Only the first <bytes> of data will be displayed. Defaults to 5MB.',
},
parsingConfig: {
description:
'Configuration for parsing the CSV data. See https://www.papaparse.com/docs#config for more details',
},
uniqueId: {
description:
'Provide a unique ID to help with cache revalidation of the fetched data.',
},
},
};

@@ -36,9 +38,10 @@ type Story = StoryObj<FlatUiTableProps>;

// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
export const FromColumnsAndData: Story = {
name: 'Table data',
name: 'Table from array or objects',
args: {
data: [
data: {
values: [
{ id: 1, lastName: 'Snow', firstName: 'Jon', age: 35 },
{ id: 2, lastName: 'Lannister', firstName: 'Cersei', age: 42 },
{ id: 3, lastName: 'Lannister', firstName: 'Jaime', age: 45 },
@@ -48,22 +51,27 @@ export const FromColumnsAndData: Story = {
{ id: 9, lastName: 'Roxie', firstName: 'Harvey', age: 65 },
],
},
},
};

export const FromRawCSV: Story = {
name: 'Table from raw CSV',
name: 'Table from inline CSV',
args: {
rawCsv: `
data: {
csv: `
Year,Temp Anomaly
1850,-0.418
2020,0.923
`,
},
},
};

export const FromURL: Story = {
name: 'Table from URL',
args: {
data: {
url: 'https://storage.openspending.org/alberta-budget/__os_imported__alberta_total.csv',
},
},
};
```
@@ -3,17 +3,17 @@ import { type Meta, type StoryObj } from '@storybook/react';
import { Iframe, IframeProps } from '../src/components/Iframe';

const meta: Meta = {
title: 'Components/Iframe',
title: 'Components/Embedding/Iframe',
component: Iframe,
tags: ['autodocs'],
argTypes: {
url: {
data: {
description:
'Page to display inside of the component',
'Object with a `url` property pointing to the page to be embeded.',
},
style: {
description:
'Style of the component',
'Style object of the component. See example at https://react.dev/learn#displaying-data. Defaults to `{ width: "100%", height: "100%" }`',
},
},
};
@@ -25,7 +25,9 @@ type Story = StoryObj<IframeProps>;
export const Normal: Story = {
name: 'Iframe',
args: {
data: {
url: 'https://app.powerbi.com/view?r=eyJrIjoiYzBmN2Q2MzYtYzE3MS00ODkxLWE5OWMtZTQ2MjBlMDljMDk4IiwidCI6Ijk1M2IwZjgzLTFjZTYtNDVjMy04MmM5LTFkODQ3ZTM3MjMzOSIsImMiOjh9',
style: {width: `100%`, height: `100%`}
},
style: { width: `100%`, height: `600px` },
},
};

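A quick sketch of the reworked `Iframe` props, mirroring the story above (the URL is a placeholder):

```ts
import { IframeProps } from '../src/components/Iframe';

// Embed an external page and give the iframe an explicit height.
const embedArgs: IframeProps = {
  data: { url: 'https://example.com/dashboard' }, // hypothetical URL
  style: { width: '100%', height: '600px' },
};
```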
@@ -4,37 +4,40 @@ import { LineChart, LineChartProps } from '../src/components/LineChart';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/LineChart',
title: 'Components/Charts/LineChart',
component: LineChart,
tags: ['autodocs'],
argTypes: {
data: {
description:
'Data to be displayed.\n\n E.g.: [["1990", 1], ["1991", 2]] \n\nOR\n\n "https://url.to/data.csv"',
'Data to be displayed. \n\n \
Must be an object with one of the following properties: `url` or `values` \n\n \
`url`: URL pointing to a CSV file. \n\n \
`values`: array of objects \n\n',
},
title: {
description: 'Title to display on the chart. Optional.',
description: 'Title to display on the chart.',
},
xAxis: {
description:
'Name of the X axis on the data. Required when the "data" parameter is an URL.',
'Name of the column header or object property that represents the X-axis on the data.',
},
xAxisType: {
description: 'Type of the X axis',
description: 'Type of the X-axis.',
},
xAxisTimeUnit: {
description: 'Time unit of the X axis (optional)',
description: 'Time unit of the X-axis, in case its type is `temporal.`',
},
yAxis: {
description:
'Name of the Y axis on the data. Required when the "data" parameter is an URL.',
'Name of the column headers or object properties that represent the Y-axis on the data.',
},
yAxisType: {
description: 'Type of the Y axis',
description: 'Type of the Y-axis',
},
fullWidth: {
symbol: {
description:
'Whether the component should be rendered as full bleed or not',
'Name of the column header or object property that represents a series for multiple series.',
},
},
};
@@ -47,21 +50,72 @@ type Story = StoryObj<LineChartProps>;
export const FromDataPoints: Story = {
name: 'Line chart from array of data points',
args: {
data: [
['1850', -0.41765878],
['1851', -0.2333498],
['1852', -0.22939907],
['1853', -0.27035445],
['1854', -0.29163003],
data: {
values: [
{ year: '1850', value: -0.41765878 },
{ year: '1851', value: -0.2333498 },
{ year: '1852', value: -0.22939907 },
{ year: '1853', value: -0.27035445 },
{ year: '1854', value: -0.29163003 },
],
},
xAxis: 'year',
yAxis: 'value',
},
};

export const MultiSeries: Story = {
name: 'Line chart with multiple series (specifying symbol)',
args: {
data: {
values: [
{ year: '1850', value: -0.41765878, z: 'A' },
{ year: '1851', value: -0.2333498, z: 'A' },
{ year: '1852', value: -0.22939907, z: 'A' },
{ year: '1853', value: -0.27035445, z: 'A' },
{ year: '1854', value: -0.29163003, z: 'A' },
{ year: '1850', value: -0.42993882, z: 'B' },
{ year: '1851', value: -0.30365549, z: 'B' },
{ year: '1852', value: -0.27905189, z: 'B' },
{ year: '1853', value: -0.22939704, z: 'B' },
{ year: '1854', value: -0.25688013, z: 'B' },
{ year: '1850', value: -0.4757164, z: 'C' },
{ year: '1851', value: -0.41971018, z: 'C' },
{ year: '1852', value: -0.40724799, z: 'C' },
{ year: '1853', value: -0.45049156, z: 'C' },
{ year: '1854', value: -0.41896583, z: 'C' },
],
},
xAxis: 'year',
yAxis: 'value',
symbol: 'z',
},
};

export const MultiColumns: Story = {
name: 'Line chart with multiple series (with multiple columns)',
args: {
data: {
values: [
{ year: '1850', A: -0.41765878, B: -0.42993882, C: -0.4757164 },
{ year: '1851', A: -0.2333498, B: -0.30365549, C: -0.41971018 },
{ year: '1852', A: -0.22939907, B: -0.27905189, C: -0.40724799 },
{ year: '1853', A: -0.27035445, B: -0.22939704, C: -0.45049156 },
{ year: '1854', A: -0.29163003, B: -0.25688013, C: -0.41896583 },
],
},
xAxis: 'year',
yAxis: ['A', 'B', 'C'],
},
};

export const FromURL: Story = {
name: 'Line chart from URL',
args: {
data: {
url: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
},
title: 'Oil Price x Year',
data: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
xAxis: 'Date',
yAxis: 'Price',
},

@@ -1,74 +0,0 @@
|
||||
import type { Meta, StoryObj } from '@storybook/react';
|
||||
|
||||
import { PlotlyLineChart, PlotlyLineChartProps } from '../src/components/PlotlyLineChart';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/PlotlyLineChart',
|
||||
component: PlotlyLineChart,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
url: {
|
||||
description:
|
||||
'CSV Url to be parsed and used as data source',
|
||||
},
|
||||
data: {
|
||||
description:
|
||||
'Data to be displayed. as an array of key value pairs \n\n E.g.: [{ year: 1850, temperature: -0.41765878 }, { year: 1851, temperature: -0.2333498 }, ...]',
|
||||
},
|
||||
rawCsv: {
|
||||
description:
|
||||
'Raw csv data to be parsed and used as data source',
|
||||
},
|
||||
bytes: {
|
||||
description:
|
||||
'How many bytes to read from the url',
|
||||
},
|
||||
parsingConfig: {
|
||||
description: 'If using url or rawCsv, this parsing config will be used to parse the data. Optional, check https://www.papaparse.com/ for more info',
|
||||
},
|
||||
title: {
|
||||
description: 'Title to display on the chart. Optional.',
|
||||
},
|
||||
lineLabel: {
|
||||
description: 'Label to display on the line, Optional, will use yAxis if not provided',
|
||||
},
|
||||
xAxis: {
|
||||
description:
|
||||
'Name of the X axis on the data. Required when the "data" parameter is an URL.',
|
||||
},
|
||||
yAxis: {
|
||||
description:
|
||||
'Name of the Y axis on the data. Required when the "data" parameter is an URL.',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export default meta;
|
||||
|
||||
type Story = StoryObj<PlotlyLineChartProps>;
|
||||
|
||||
export const FromDataPoints: Story = {
|
||||
name: 'Line chart from array of data points',
|
||||
args: {
|
||||
data: [
|
||||
{year: '1850', temperature: -0.41765878},
|
||||
{year: '1851', temperature: -0.2333498},
|
||||
{year: '1852', temperature: -0.22939907},
|
||||
{year: '1853', temperature: -0.27035445},
|
||||
{year: '1854', temperature: -0.29163003},
|
||||
],
|
||||
xAxis: 'year',
|
||||
yAxis: 'temperature',
|
||||
},
|
||||
};
|
||||
|
||||
export const FromURL: Story = {
|
||||
name: 'Line chart from URL',
|
||||
args: {
|
||||
title: 'Oil Price x Year',
|
||||
url: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
|
||||
xAxis: 'Date',
|
||||
yAxis: 'Price',
|
||||
},
|
||||
};
|
||||
@@ -4,29 +4,34 @@ import { Map, MapProps } from '../src/components/Map';

// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
const meta: Meta = {
title: 'Components/Map',
title: 'Components/Geospatial/Map',
component: Map,
tags: ['autodocs'],
argTypes: {
layers: {
description:
'Data to be displayed.\n\n GeoJSON Object \n\nOR\n\n URL to GeoJSON Object',
'Array of layers to be displayed on the map. Should be an object with: \n\n \
`data`: object with either a `url` property pointing to a GeoJSON file or a `geojson` property with a GeoJSON object. \n\n \
`name`: name of the layer. \n\n \
`colorscale`: object with a `starting` and `ending` colors that will be used to create a gradient and color the map. \n\n \
`tooltip`: `true` to show all available features on the tooltip, object with a `propNames` property as an array of strings to choose which features to display. \n\n',
},
title: {
description: 'Title to display on the map. Optional.',
description: 'Title to display on the map.',
},
center: {
description: 'Initial coordinates of the center of the map',
},
zoom: {
description: 'Zoom level',
description: 'Initial zoom level',
},
style: {
description: "Styles for the container"
description: "CSS styles to be applied to the map's container.",
},
autoZoomConfiguration: {
description: "Configuration to auto zoom in the specified layer data"
}
description:
"Pass a layer's name to automatically zoom to the bounding area of a layer.",
},
},
};

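To make the `layers` description above concrete, here is a hedged sketch of a single-layer configuration. The URL and colors are placeholders and the exact color string format is an assumption; the property names follow the stories below.

```ts
import { MapProps } from '../src/components/Map';

// One GeoJSON layer with a tooltip limited to the "name" property and a simple color scale.
const args: MapProps = {
  layers: [
    {
      data: { url: 'https://example.com/regions.geojson' }, // hypothetical URL
      name: 'Regions',
      tooltip: { propNames: ['name'] },
      colorScale: { starting: '#ffffff', ending: '#1d4ed8' }, // assumed color format
    },
  ],
  title: 'Regions map',
  center: { latitude: 45, longitude: 0 },
  zoom: 2,
};
```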
@@ -38,9 +43,15 @@ type Story = StoryObj<MapProps>;
|
||||
export const GeoJSONPolygons: Story = {
|
||||
name: 'GeoJSON polygons map',
|
||||
args: {
|
||||
tileLayerName:'MapBox',
|
||||
tileLayerOptions:{
|
||||
accessToken : 'pk.eyJ1Ijoid2lsbHktcGFsbWFyZWpvIiwiYSI6ImNqNzk5NmRpNDFzb2cyeG9sc2luMHNjajUifQ.lkoVRFSI8hOLH4uJeOzwXw',
|
||||
},
|
||||
layers: [
|
||||
{
|
||||
data: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
data: {
|
||||
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
},
|
||||
name: 'Polygons',
|
||||
tooltip: { propNames: ['name'] },
|
||||
colorScale: {
|
||||
@@ -60,7 +71,9 @@ export const GeoJSONPoints: Story = {
|
||||
args: {
|
||||
layers: [
|
||||
{
|
||||
data: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
data: {
|
||||
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
},
|
||||
name: 'Points',
|
||||
tooltip: { propNames: ['Location'] },
|
||||
},
|
||||
@@ -76,12 +89,16 @@ export const GeoJSONMultipleLayers: Story = {
|
||||
args: {
|
||||
layers: [
|
||||
{
|
||||
data: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
data: {
|
||||
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
},
|
||||
name: 'Points',
|
||||
tooltip: true,
|
||||
},
|
||||
{
|
||||
data: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
data: {
|
||||
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
},
|
||||
name: 'Polygons',
|
||||
tooltip: true,
|
||||
colorScale: {
|
||||
@@ -94,19 +111,23 @@ export const GeoJSONMultipleLayers: Story = {
|
||||
center: { latitude: 45, longitude: 0 },
|
||||
zoom: 2,
|
||||
},
|
||||
}
|
||||
};
|
||||
|
||||
export const GeoJSONMultipleLayersWithAutoZoomInSpecifiedLayer: Story = {
|
||||
name: 'GeoJSON polygons and points map with auto zoom in the points layer',
|
||||
args: {
|
||||
layers: [
|
||||
{
|
||||
data: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
data: {
|
||||
url: 'https://opendata.arcgis.com/datasets/9c58741995174fbcb017cf46c8a42f4b_25.geojson',
|
||||
},
|
||||
name: 'Points',
|
||||
tooltip: true,
|
||||
},
|
||||
{
|
||||
data: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
data: {
|
||||
url: 'https://d2ad6b4ur7yvpq.cloudfront.net/naturalearth-3.3.0/ne_10m_geography_marine_polys.geojson',
|
||||
},
|
||||
name: 'Polygons',
|
||||
tooltip: true,
|
||||
colorScale: {
|
||||
@@ -119,7 +140,7 @@ export const GeoJSONMultipleLayersWithAutoZoomInSpecifiedLayer: Story = {
|
||||
center: { latitude: 45, longitude: 0 },
|
||||
zoom: 2,
|
||||
autoZoomConfiguration: {
|
||||
layerName: 'Points'
|
||||
}
|
||||
layerName: 'Points',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
@@ -1,3 +1,6 @@
|
||||
// NOTE: this component was renamed with .bkp so that it's hidden
|
||||
// from the Storybook app
|
||||
|
||||
import type { Meta, StoryObj } from '@storybook/react';
|
||||
import React from 'react';
|
||||
import OpenLayers from '../src/components/OpenLayers/OpenLayers';
|
||||
@@ -3,19 +3,21 @@ import type { Meta, StoryObj } from '@storybook/react';
import { PdfViewer, PdfViewerProps } from '../src/components/PdfViewer';

const meta: Meta = {
title: 'Components/PdfViewer',
title: 'Components/Embedding/PdfViewer',
component: PdfViewer,
tags: ['autodocs'],
argTypes: {
url: {
description: 'URL to PDF file',
data: {
description:
'Object with a `url` property pointing to the PDF file to be displayed, e.g.: `{ url: "https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK" }`.',
},
parentClassName: {
description: 'Classname for the parent div of the pdf viewer',
},
layour: {
description:
'Set to true if you want to have a layout with zoom level, page count, printing button etc',
'HTML classes to be applied to the container of the PDF viewer. [Tailwind](https://tailwindcss.com/) classes, such as `h-96` to define the height of the component, can be used on this field.',
},
layout: {
description:
'Set to `true` if you want to display a layout with zoom level, page count, printing button and other controls.',
defaultValue: false,
},
},
@@ -25,26 +27,23 @@ export default meta;
|
||||
|
||||
type Story = StoryObj<PdfViewerProps>;
|
||||
|
||||
export const PdfViewerStory: Story = {
|
||||
name: 'PdfViewer',
|
||||
export const PdfViewerStoryWithoutControlsLayout: Story = {
|
||||
name: 'PDF Viewer without controls layout',
|
||||
args: {
|
||||
data: {
|
||||
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
|
||||
},
|
||||
};
|
||||
|
||||
export const PdfViewerStoryWithLayout: Story = {
|
||||
name: 'PdfViewer with the default layout',
|
||||
args: {
|
||||
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
|
||||
layout: true,
|
||||
},
|
||||
};
|
||||
|
||||
export const PdfViewerStoryWithHeight: Story = {
|
||||
name: 'PdfViewer with a custom height',
|
||||
args: {
|
||||
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
|
||||
parentClassName: 'h-96',
|
||||
layout: true,
|
||||
},
|
||||
};
|
||||
|
||||
export const PdfViewerStoryWithControlsLayout: Story = {
|
||||
name: 'PdfViewer with controls layout',
|
||||
args: {
|
||||
data: {
|
||||
url: 'https://cdn.filestackcontent.com/wcrjf9qPTCKXV3hMXDwK',
|
||||
},
|
||||
layout: true,
|
||||
parentClassName: 'h-96',
|
||||
},
|
||||
};
|
||||
|
||||
@@ -4,9 +4,19 @@ import { Plotly } from '../src/components/Plotly';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/Plotly',
|
||||
title: 'Components/Charts/Plotly',
|
||||
component: Plotly,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
data: {
|
||||
description:
|
||||
"Plotly's `data` prop. You can find references on how to use these props at https://github.com/plotly/react-plotly.js/#basic-props.",
|
||||
},
|
||||
layout: {
|
||||
description:
|
||||
"Plotly's `layout` prop. You can find references on how to use these props at https://github.com/plotly/react-plotly.js/#basic-props.",
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export default meta;
|
||||
@@ -15,7 +25,7 @@ type Story = StoryObj<any>;
|
||||
|
||||
// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
|
||||
export const Primary: Story = {
|
||||
name: 'Chart built with Plotly',
|
||||
name: 'Line chart',
|
||||
args: {
|
||||
data: [
|
||||
{
|
||||
|
||||
102
packages/components/stories/PlotlyBarChart.stories.ts
Normal file
@@ -0,0 +1,102 @@
|
||||
import type { Meta, StoryObj } from '@storybook/react';
|
||||
|
||||
import {
|
||||
PlotlyBarChart,
|
||||
PlotlyBarChartProps,
|
||||
} from '../src/components/PlotlyBarChart';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/Charts/PlotlyBarChart',
|
||||
component: PlotlyBarChart,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
data: {
|
||||
description:
|
||||
'Data to be displayed. \n\n \
|
||||
Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
|
||||
`url`: URL pointing to a CSV file. \n\n \
|
||||
`values`: array of objects (check out [this example](/?path=/story/components-plotlybarchart--from-data-points)) \n\n \
|
||||
`csv`: string with valid CSV (check out [this example](/?path=/story/components-plotlybarchart--from-inline-csv)) \n\n \
|
||||
',
|
||||
},
|
||||
bytes: {
|
||||
// TODO: likely this should be an extra option on the data parameter,
|
||||
// specific to URLs
|
||||
description:
|
||||
"How many bytes to read from the url so that the entire file doesn's have to be fetched.",
|
||||
},
|
||||
parsingConfig: {
|
||||
description:
|
||||
'If using URL or CSV, this parsing config will be used to parse the data. Check https://www.papaparse.com/ for more info.',
|
||||
},
|
||||
title: {
|
||||
description: 'Title to display on the chart.',
|
||||
},
|
||||
// TODO: commented out because this doesn't work
|
||||
// lineLabel: {
|
||||
// description:
|
||||
// 'Label to display on the line, Optional, will use yAxis if not provided',
|
||||
// },
|
||||
xAxis: {
|
||||
description:
|
||||
'Name of the column header or object property that represents the X-axis on the data.',
|
||||
},
|
||||
yAxis: {
|
||||
description:
|
||||
'Name of the column header or object property that represents the Y-axis on the data.',
|
||||
},
|
||||
uniqueId: {
|
||||
description: 'Provide a unique ID to help with cache revalidation of the fetched data.'
|
||||
}
|
||||
},
|
||||
};
|
||||
|
||||
export default meta;
|
||||
|
||||
type Story = StoryObj<PlotlyBarChartProps>;
|
||||
|
||||
export const FromDataPoints: Story = {
|
||||
name: 'Bar chart from array of data points',
|
||||
args: {
|
||||
data: {
|
||||
values: [
|
||||
{ year: '1850', temperature: -0.41765878 },
|
||||
{ year: '1851', temperature: -0.2333498 },
|
||||
{ year: '1852', temperature: -0.22939907 },
|
||||
{ year: '1853', temperature: -0.27035445 },
|
||||
{ year: '1854', temperature: -0.29163003 },
|
||||
],
|
||||
},
|
||||
xAxis: 'year',
|
||||
yAxis: 'temperature',
|
||||
},
|
||||
};
|
||||
|
||||
export const FromURL: Story = {
|
||||
name: 'Bar chart from URL',
|
||||
args: {
|
||||
title: 'Apple Stock Prices',
|
||||
data: {
|
||||
url: 'https://raw.githubusercontent.com/plotly/datasets/master/finance-charts-apple.csv',
|
||||
},
|
||||
xAxis: 'Date',
|
||||
yAxis: 'AAPL.Open',
|
||||
},
|
||||
};
|
||||
|
||||
export const FromInlineCSV: Story = {
|
||||
name: 'Bar chart from inline CSV',
|
||||
args: {
|
||||
title: 'Apple Stock Prices',
|
||||
data: {
|
||||
csv: `Date,AAPL.Open,AAPL.High,AAPL.Low,AAPL.Close,AAPL.Volume,AAPL.Adjusted,dn,mavg,up,direction
|
||||
2015-02-17,127.489998,128.880005,126.919998,127.830002,63152400,122.905254,106.7410523,117.9276669,129.1142814,Increasing
|
||||
2015-02-18,127.629997,128.779999,127.449997,128.720001,44891700,123.760965,107.842423,118.9403335,130.0382439,Increasing
|
||||
2015-02-19,128.479996,129.029999,128.330002,128.449997,37362400,123.501363,108.8942449,119.8891668,130.8840887,Decreasing
|
||||
2015-02-20,128.619995,129.5,128.050003,129.5,48948400,124.510914,109.7854494,120.7635001,131.7415509,Increasing`,
|
||||
},
|
||||
xAxis: 'Date',
|
||||
yAxis: 'AAPL.Open',
|
||||
},
|
||||
};
|
||||
101
packages/components/stories/PlotlyLineChart.stories.ts
Normal file
@@ -0,0 +1,101 @@
|
||||
import type { Meta, StoryObj } from '@storybook/react';
|
||||
|
||||
import {
|
||||
PlotlyLineChart,
|
||||
PlotlyLineChartProps,
|
||||
} from '../src/components/PlotlyLineChart';
|
||||
|
||||
const meta: Meta = {
|
||||
title: 'Components/Charts/PlotlyLineChart',
|
||||
component: PlotlyLineChart,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
data: {
|
||||
description:
|
||||
'Data to be displayed. \n\n \
|
||||
Must be an object with one of the following properties: `url`, `values` or `csv` \n\n \
|
||||
`url`: URL pointing to a CSV file. \n\n \
|
||||
`values`: array of objects. \n\n \
|
||||
`csv`: string with valid CSV. \n\n \
|
||||
',
|
||||
},
|
||||
bytes: {
|
||||
// TODO: likely this should be an extra option on the data parameter,
|
||||
// specific to URLs
|
||||
description:
|
||||
"How many bytes to read from the url so that the entire file doesn's have to be fetched.",
|
||||
},
|
||||
parsingConfig: {
|
||||
description:
|
||||
'If using URL or CSV, this parsing config will be used to parse the data. Check https://www.papaparse.com/ for more info',
|
||||
},
|
||||
title: {
|
||||
description: 'Title to display on the chart.',
|
||||
},
|
||||
lineLabel: {
|
||||
description:
|
||||
'Label to display on the line, will use yAxis if not provided',
|
||||
},
|
||||
xAxis: {
|
||||
description:
|
||||
'Name of the column header or object property that represents the X-axis on the data.',
|
||||
},
|
||||
yAxis: {
|
||||
description:
|
||||
'Name of the column header or object property that represents the Y-axis on the data.',
|
||||
},
|
||||
uniqueId: {
|
||||
description:
|
||||
'Provide a unique ID to help with cache revalidation of the fetched data.',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export default meta;
|
||||
|
||||
type Story = StoryObj<PlotlyLineChartProps>;
|
||||
|
||||
export const FromDataPoints: Story = {
|
||||
name: 'Line chart from array of data points',
|
||||
args: {
|
||||
data: {
|
||||
values: [
|
||||
{ year: '1850', temperature: -0.41765878 },
|
||||
{ year: '1851', temperature: -0.2333498 },
|
||||
{ year: '1852', temperature: -0.22939907 },
|
||||
{ year: '1853', temperature: -0.27035445 },
|
||||
{ year: '1854', temperature: -0.29163003 },
|
||||
],
|
||||
},
|
||||
xAxis: 'year',
|
||||
yAxis: 'temperature',
|
||||
},
|
||||
};
|
||||
|
||||
export const FromURL: Story = {
|
||||
name: 'Line chart from URL',
|
||||
args: {
|
||||
title: 'Oil Price x Year',
|
||||
data: {
|
||||
url: 'https://raw.githubusercontent.com/datasets/oil-prices/main/data/wti-year.csv',
|
||||
},
|
||||
xAxis: 'Date',
|
||||
yAxis: 'Price',
|
||||
},
|
||||
};
|
||||
|
||||
export const FromInlineCSV: Story = {
|
||||
name: 'Bar chart from inline CSV',
|
||||
args: {
|
||||
title: 'Apple Stock Prices',
|
||||
data: {
|
||||
csv: `Date,AAPL.Open,AAPL.High,AAPL.Low,AAPL.Close,AAPL.Volume,AAPL.Adjusted,dn,mavg,up,direction
|
||||
2015-02-17,127.489998,128.880005,126.919998,127.830002,63152400,122.905254,106.7410523,117.9276669,129.1142814,Increasing
|
||||
2015-02-18,127.629997,128.779999,127.449997,128.720001,44891700,123.760965,107.842423,118.9403335,130.0382439,Increasing
|
||||
2015-02-19,128.479996,129.029999,128.330002,128.449997,37362400,123.501363,108.8942449,119.8891668,130.8840887,Decreasing
|
||||
2015-02-20,128.619995,129.5,128.050003,129.5,48948400,124.510914,109.7854494,120.7635001,131.7415509,Increasing`,
|
||||
},
|
||||
xAxis: 'Date',
|
||||
yAxis: 'AAPL.Open',
|
||||
},
|
||||
};
|
||||
@@ -1,10 +1,13 @@
|
||||
// NOTE: this component was renamed with .bkp so that it's hidden
|
||||
// from the Storybook app
|
||||
|
||||
import type { Meta, StoryObj } from '@storybook/react';
|
||||
|
||||
import { Table, TableProps } from '../src/components/Table';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/Table',
|
||||
title: 'Components/Tabular/Table',
|
||||
component: Table,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
@@ -4,9 +4,19 @@ import { Vega } from '../src/components/Vega';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/Vega',
|
||||
title: 'Components/Charts/Vega',
|
||||
component: Vega,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
data: {
|
||||
description:
|
||||
"Vega's `data` prop. You can find references on how to use this prop at https://vega.github.io/vega/docs/data/",
|
||||
},
|
||||
spec: {
|
||||
description:
|
||||
"Vega's `spec` prop. You can find references on how to use this prop at https://vega.github.io/vega/docs/specification/",
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
export default meta;
|
||||
@@ -15,7 +25,7 @@ type Story = StoryObj<any>;
|
||||
|
||||
// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
|
||||
export const Primary: Story = {
|
||||
name: 'Chart built with Vega',
|
||||
name: 'Bar chart',
|
||||
args: {
|
||||
data: {
|
||||
table: [
|
||||
|
||||
@@ -4,7 +4,7 @@ import { VegaLite } from '../src/components/VegaLite';
|
||||
|
||||
// More on how to set up stories at: https://storybook.js.org/docs/react/writing-stories/introduction
|
||||
const meta: Meta = {
|
||||
title: 'Components/VegaLite',
|
||||
title: 'Components/Charts/VegaLite',
|
||||
component: VegaLite,
|
||||
tags: ['autodocs'],
|
||||
argTypes: {
|
||||
@@ -25,7 +25,7 @@ type Story = StoryObj<any>;
|
||||
|
||||
// More on writing stories with args: https://storybook.js.org/docs/react/writing-stories/args
|
||||
export const Primary: Story = {
|
||||
name: 'Chart built with Vega Lite',
|
||||
name: 'Bar chart',
|
||||
args: {
|
||||
data: {
|
||||
table: [
|
||||
|
||||
@@ -53,7 +53,7 @@ export const Nav: React.FC<Props> = ({
|
||||
<nav className="flex justify-between">
|
||||
{/* Mobile navigation */}
|
||||
<div className="mr-2 sm:mr-4 flex lg:hidden">
|
||||
<NavMobile links={links}>{children}</NavMobile>
|
||||
<NavMobile {...{title, links, social, search, defaultTheme, themeToggleIcon}}>{children}</NavMobile>
|
||||
</div>
|
||||
{/* Non-mobile navigation */}
|
||||
<div className="flex flex-none items-center">
|
||||
|
||||
@@ -4,20 +4,16 @@ import { useRouter } from "next/router.js";
|
||||
import { useEffect, useState } from "react";
|
||||
import { SearchContext, SearchField } from "../Search";
|
||||
import { MenuIcon, CloseIcon } from "../Icons";
|
||||
import { NavLink, SearchProviderConfig } from "../types";
|
||||
import type { NavConfig, ThemeConfig } from "./Nav";
|
||||
|
||||
interface Props extends React.PropsWithChildren {
|
||||
author?: string;
|
||||
links?: Array<NavLink>;
|
||||
search?: SearchProviderConfig;
|
||||
}
|
||||
interface Props extends NavConfig, ThemeConfig, React.PropsWithChildren {}
|
||||
|
||||
// TODO why mobile navigation only accepts author and regular nav accepts different things like title, logo, version
|
||||
// TODO: Search doesn't appear
|
||||
export const NavMobile: React.FC<Props> = ({
|
||||
children,
|
||||
title,
|
||||
links,
|
||||
search,
|
||||
author,
|
||||
}) => {
|
||||
const router = useRouter();
|
||||
const [isOpen, setIsOpen] = useState(false);
|
||||
@@ -77,8 +73,8 @@ export const NavMobile: React.FC<Props> = ({
|
||||
legacyBehavior
|
||||
>
|
||||
{/* <Logomark className="h-9 w-9" /> */}
|
||||
<div className="font-extrabold text-primary dark:text-primary-dark text-2xl ml-6">
|
||||
{author}
|
||||
<div className="font-extrabold text-primary dark:text-primary-dark text-lg ml-6">
|
||||
{title}
|
||||
</div>
|
||||
</Link>
|
||||
</div>
|
||||
@@ -106,9 +102,7 @@ export const NavMobile: React.FC<Props> = ({
|
||||
))}
|
||||
</ul>
|
||||
)}
|
||||
{/* <div className="pt-6 border border-t-2">
|
||||
{children}
|
||||
</div> */}
|
||||
<div className="pt-6">{children}</div>
|
||||
</Dialog.Panel>
|
||||
</Dialog>
|
||||
</>
|
||||
|
||||
@@ -46,8 +46,8 @@ export const SiteToc: React.FC<Props> = ({ currentPath, nav }) => {
|
||||
|
||||
return (
|
||||
<nav data-testid="lhs-sidebar" className="flex flex-col space-y-3 text-sm">
|
||||
{sortNavGroupChildren(nav).map((n) => (
|
||||
<NavComponent item={n} isActive={false} />
|
||||
{sortNavGroupChildren(nav).map((n, index) => (
|
||||
<NavComponent key={index} item={n} isActive={false} />
|
||||
))}
|
||||
</nav>
|
||||
);
|
||||
@@ -96,8 +96,8 @@ const NavComponent: React.FC<{
|
||||
leaveTo="transform scale-95 opacity-0"
|
||||
>
|
||||
<Disclosure.Panel className="flex flex-col space-y-3 pl-5 mt-3">
|
||||
{sortNavGroupChildren(item.children).map((subItem) => (
|
||||
<NavComponent item={subItem} isActive={false} />
|
||||
{sortNavGroupChildren(item.children).map((subItem, index) => (
|
||||
<NavComponent key={index} item={subItem} isActive={false} />
|
||||
))}
|
||||
</Disclosure.Panel>
|
||||
</Transition>
|
||||
|
||||
@@ -1,5 +1,11 @@
|
||||
# @portaljs/remark-wiki-link
|
||||
|
||||
## 1.2.0
|
||||
|
||||
### Minor Changes
|
||||
|
||||
- [#1084](https://github.com/datopian/datahub/pull/1084) [`57952e08`](https://github.com/datopian/datahub/commit/57952e0817770138881e7492dc9f43e9910b56a8) Thanks [@mohamedsalem401](https://github.com/mohamedsalem401)! - Add image resize feature
|
||||
|
||||
## 1.1.2
|
||||
|
||||
### Patch Changes
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "@portaljs/remark-wiki-link",
|
||||
"version": "1.1.2",
|
||||
"version": "1.2.0",
|
||||
"description": "Parse and render wiki-style links in markdown especially Obsidian style links.",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
|
||||
@@ -1,23 +1,23 @@
|
||||
import { isSupportedFileFormat } from "./isSupportedFileFormat";
|
||||
import { isSupportedFileFormat } from './isSupportedFileFormat';
|
||||
|
||||
const defaultWikiLinkResolver = (target: string) => {
|
||||
// for [[#heading]] links
|
||||
if (!target) {
|
||||
return [];
|
||||
}
|
||||
let permalink = target.replace(/\/index$/, "");
|
||||
let permalink = target.replace(/\/index$/, '');
|
||||
// TODO what to do with [[index]] link?
|
||||
if (permalink.length === 0) {
|
||||
permalink = "/";
|
||||
permalink = '/';
|
||||
}
|
||||
return [permalink];
|
||||
};
|
||||
|
||||
export interface FromMarkdownOptions {
|
||||
pathFormat?:
|
||||
| "raw" // default; use for regular relative or absolute paths
|
||||
| "obsidian-absolute" // use for Obsidian-style absolute paths (with no leading slash)
|
||||
| "obsidian-short"; // use for Obsidian-style shortened paths (shortest path possible)
|
||||
| 'raw' // default; use for regular relative or absolute paths
|
||||
| 'obsidian-absolute' // use for Obsidian-style absolute paths (with no leading slash)
|
||||
| 'obsidian-short'; // use for Obsidian-style shortened paths (shortest path possible)
|
||||
permalinks?: string[]; // list of permalinks to match possible permalinks of a wiki link against
|
||||
wikiLinkResolver?: (name: string) => string[]; // function to resolve wiki links to an array of possible permalinks
|
||||
newClassName?: string; // class name to add to links that don't have a matching permalink
|
||||
@@ -25,14 +25,23 @@ export interface FromMarkdownOptions {
|
||||
hrefTemplate?: (permalink: string) => string; // function to generate the href attribute of a link
|
||||
}
|
||||
|
||||
export function getImageSize(size: string) {
// eslint-disable-next-line prefer-const
let [width, height] = size.split('x');

if (!height) height = width;

return { width, height };
}

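A couple of illustrative calls, matching the behaviour covered by the tests further down (a single number is used for both dimensions):

```ts
getImageSize('200');     // { width: '200', height: '200' }
getImageSize('132x612'); // { width: '132', height: '612' }
```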
// mdast-util-from-markdown extension
|
||||
// https://github.com/syntax-tree/mdast-util-from-markdown#extension
|
||||
function fromMarkdown(opts: FromMarkdownOptions = {}) {
|
||||
const pathFormat = opts.pathFormat || "raw";
|
||||
const pathFormat = opts.pathFormat || 'raw';
|
||||
const permalinks = opts.permalinks || [];
|
||||
const wikiLinkResolver = opts.wikiLinkResolver || defaultWikiLinkResolver;
|
||||
const newClassName = opts.newClassName || "new";
|
||||
const wikiLinkClassName = opts.wikiLinkClassName || "internal";
|
||||
const newClassName = opts.newClassName || 'new';
|
||||
const wikiLinkClassName = opts.wikiLinkClassName || 'internal';
|
||||
const defaultHrefTemplate = (permalink: string) => permalink;
|
||||
|
||||
const hrefTemplate = opts.hrefTemplate || defaultHrefTemplate;
|
||||
@@ -44,9 +53,9 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
|
||||
function enterWikiLink(token) {
|
||||
this.enter(
|
||||
{
|
||||
type: "wikiLink",
|
||||
type: 'wikiLink',
|
||||
data: {
|
||||
isEmbed: token.isType === "embed",
|
||||
isEmbed: token.isType === 'embed',
|
||||
target: null, // the target of the link, e.g. "Foo Bar#Heading" in "[[Foo Bar#Heading]]"
|
||||
alias: null, // the alias of the link, e.g. "Foo" in "[[Foo Bar|Foo]]"
|
||||
permalink: null, // TODO shouldn't this be named just "link"?
|
||||
@@ -80,18 +89,18 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
|
||||
} = wikiLink;
|
||||
// eslint-disable-next-line no-useless-escape
|
||||
const wikiLinkWithHeadingPattern = /^(.*?)(#.*)?$/u;
|
||||
const [, path, heading = ""] = target.match(wikiLinkWithHeadingPattern);
|
||||
const [, path, heading = ''] = target.match(wikiLinkWithHeadingPattern);
|
||||
|
||||
const possibleWikiLinkPermalinks = wikiLinkResolver(path);
|
||||
|
||||
const matchingPermalink = permalinks.find((e) => {
|
||||
return possibleWikiLinkPermalinks.find((p) => {
|
||||
if (pathFormat === "obsidian-short") {
|
||||
if (pathFormat === 'obsidian-short') {
|
||||
if (e === p || e.endsWith(p)) {
|
||||
return true;
|
||||
}
|
||||
} else if (pathFormat === "obsidian-absolute") {
|
||||
if (e === "/" + p) {
|
||||
} else if (pathFormat === 'obsidian-absolute') {
|
||||
if (e === '/' + p) {
|
||||
return true;
|
||||
}
|
||||
} else {
|
||||
@@ -106,20 +115,19 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
|
||||
// TODO this is ugly
|
||||
const link =
|
||||
matchingPermalink ||
|
||||
(pathFormat === "obsidian-absolute"
|
||||
? "/" + possibleWikiLinkPermalinks[0]
|
||||
(pathFormat === 'obsidian-absolute'
|
||||
? '/' + possibleWikiLinkPermalinks[0]
|
||||
: possibleWikiLinkPermalinks[0]) ||
|
||||
"";
|
||||
'';
|
||||
|
||||
wikiLink.data.exists = !!matchingPermalink;
|
||||
wikiLink.data.permalink = link;
|
||||
|
||||
// remove leading # if the target is a heading on the same page
|
||||
const displayName = alias || target.replace(/^#/, "");
|
||||
const headingId = heading.replace(/\s+/g, "-").toLowerCase();
|
||||
const displayName = alias || target.replace(/^#/, '');
|
||||
const headingId = heading.replace(/\s+/g, '-').toLowerCase();
|
||||
let classNames = wikiLinkClassName;
|
||||
if (!matchingPermalink) {
|
||||
classNames += " " + newClassName;
|
||||
classNames += ' ' + newClassName;
|
||||
}
|
||||
|
||||
if (isEmbed) {
|
||||
@@ -127,44 +135,55 @@ function fromMarkdown(opts: FromMarkdownOptions = {}) {
|
||||
if (!isSupportedFormat) {
|
||||
// Temporarily render note transclusion as a regular wiki link
|
||||
if (!format) {
|
||||
wikiLink.data.hName = "a";
|
||||
wikiLink.data.hName = 'a';
|
||||
wikiLink.data.hProperties = {
|
||||
className: classNames + " " + "transclusion",
|
||||
className: classNames + ' ' + 'transclusion',
|
||||
href: hrefTemplate(link) + headingId,
|
||||
};
|
||||
wikiLink.data.hChildren = [{ type: "text", value: displayName }];
|
||||
|
||||
wikiLink.data.hChildren = [{ type: 'text', value: displayName }];
|
||||
} else {
|
||||
wikiLink.data.hName = "p";
|
||||
wikiLink.data.hName = 'p';
|
||||
wikiLink.data.hChildren = [
|
||||
{
|
||||
type: "text",
|
||||
type: 'text',
|
||||
value: `![[${target}]]`,
|
||||
},
|
||||
];
|
||||
}
|
||||
} else if (format === "pdf") {
|
||||
wikiLink.data.hName = "iframe";
|
||||
} else if (format === 'pdf') {
|
||||
wikiLink.data.hName = 'iframe';
|
||||
wikiLink.data.hProperties = {
|
||||
className: classNames,
|
||||
width: "100%",
|
||||
width: '100%',
|
||||
src: `${hrefTemplate(link)}#toolbar=0`,
|
||||
};
|
||||
} else {
|
||||
wikiLink.data.hName = "img";
|
||||
const hasDimensions = alias && /^\d+(x\d+)?$/.test(alias);
|
||||
// Take the target as alt text except if alt name was provided [[target|alt text]]
|
||||
const altText = hasDimensions || !alias ? target : alias;
|
||||
|
||||
wikiLink.data.hName = 'img';
|
||||
wikiLink.data.hProperties = {
|
||||
className: classNames,
|
||||
src: hrefTemplate(link),
|
||||
alt: displayName,
|
||||
alt: altText
|
||||
};
|
||||
|
||||
if (hasDimensions) {
|
||||
const { width, height } = getImageSize(alias as string);
|
||||
Object.assign(wikiLink.data.hProperties, {
|
||||
width,
|
||||
height,
|
||||
});
|
||||
}
|
||||
}
|
||||
} else {
|
||||
wikiLink.data.hName = "a";
|
||||
wikiLink.data.hName = 'a';
|
||||
wikiLink.data.hProperties = {
|
||||
className: classNames,
|
||||
href: hrefTemplate(link) + headingId,
|
||||
};
|
||||
wikiLink.data.hChildren = [{ type: "text", value: displayName }];
|
||||
wikiLink.data.hChildren = [{ type: 'text', value: displayName }];
|
||||
}
|
||||
}
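Putting the image-resize handling above in context, this is roughly how the dimension syntax is exercised through the remark plugin. The import names follow the tests below; the package entry point and its default export are assumptions.

```ts
import { unified } from 'unified';
import markdown from 'remark-parse';
import wikiLinkPlugin from '@portaljs/remark-wiki-link'; // assumed default export

// "![[My Image.png|132x612]]" yields an img node with width="132" and height="612";
// a single number such as "![[My Image.png|200]]" is applied to both dimensions.
const processor = unified().use(markdown).use(wikiLinkPlugin);
const ast = processor.runSync(processor.parse('![[My Image.png|132x612]]'));
```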
@@ -1,23 +1,24 @@
|
||||
import { isSupportedFileFormat } from "./isSupportedFileFormat";
|
||||
import { getImageSize } from './fromMarkdown';
|
||||
import { isSupportedFileFormat } from './isSupportedFileFormat';
|
||||
|
||||
const defaultWikiLinkResolver = (target: string) => {
|
||||
// for [[#heading]] links
|
||||
if (!target) {
|
||||
return [];
|
||||
}
|
||||
let permalink = target.replace(/\/index$/, "");
|
||||
let permalink = target.replace(/\/index$/, '');
|
||||
// TODO what to do with [[index]] link?
|
||||
if (permalink.length === 0) {
|
||||
permalink = "/";
|
||||
permalink = '/';
|
||||
}
|
||||
return [permalink];
|
||||
};
|
||||
|
||||
export interface HtmlOptions {
|
||||
pathFormat?:
|
||||
| "raw" // default; use for regular relative or absolute paths
|
||||
| "obsidian-absolute" // use for Obsidian-style absolute paths (with no leading slash)
|
||||
| "obsidian-short"; // use for Obsidian-style shortened paths (shortest path possible)
|
||||
| 'raw' // default; use for regular relative or absolute paths
|
||||
| 'obsidian-absolute' // use for Obsidian-style absolute paths (with no leading slash)
|
||||
| 'obsidian-short'; // use for Obsidian-style shortened paths (shortest path possible)
|
||||
permalinks?: string[]; // list of permalinks to match possible permalinks of a wiki link against
|
||||
wikiLinkResolver?: (name: string) => string[]; // function to resolve wiki links to an array of possible permalinks
|
||||
newClassName?: string; // class name to add to links that don't have a matching permalink
|
||||
@@ -28,11 +29,11 @@ export interface HtmlOptions {
|
||||
// Micromark HtmlExtension
|
||||
// https://github.com/micromark/micromark#htmlextension
|
||||
function html(opts: HtmlOptions = {}) {
|
||||
const pathFormat = opts.pathFormat || "raw";
|
||||
const pathFormat = opts.pathFormat || 'raw';
|
||||
const permalinks = opts.permalinks || [];
|
||||
const wikiLinkResolver = opts.wikiLinkResolver || defaultWikiLinkResolver;
|
||||
const newClassName = opts.newClassName || "new";
|
||||
const wikiLinkClassName = opts.wikiLinkClassName || "internal";
|
||||
const newClassName = opts.newClassName || 'new';
|
||||
const wikiLinkClassName = opts.wikiLinkClassName || 'internal';
|
||||
const defaultHrefTemplate = (permalink: string) => permalink;
|
||||
const hrefTemplate = opts.hrefTemplate || defaultHrefTemplate;
|
||||
|
||||
@@ -41,21 +42,21 @@ function html(opts: HtmlOptions = {}) {
|
||||
}
|
||||
|
||||
function enterWikiLink() {
|
||||
let stack = this.getData("wikiLinkStack");
|
||||
if (!stack) this.setData("wikiLinkStack", (stack = []));
|
||||
let stack = this.getData('wikiLinkStack');
|
||||
if (!stack) this.setData('wikiLinkStack', (stack = []));
|
||||
|
||||
stack.push({});
|
||||
}
|
||||
|
||||
function exitWikiLinkTarget(token) {
|
||||
const target = this.sliceSerialize(token);
|
||||
const current = top(this.getData("wikiLinkStack"));
|
||||
const current = top(this.getData('wikiLinkStack'));
|
||||
current.target = target;
|
||||
}
|
||||
|
||||
function exitWikiLinkAlias(token) {
|
||||
const alias = this.sliceSerialize(token);
|
||||
const current = top(this.getData("wikiLinkStack"));
|
||||
const current = top(this.getData('wikiLinkStack'));
|
||||
current.alias = alias;
|
||||
}
|
||||
|
||||
@@ -111,7 +112,9 @@ function html(opts: HtmlOptions = {}) {
|
||||
// Temporarily render note transclusion as a regular wiki link
|
||||
if (!format) {
|
||||
this.tag(
|
||||
`<a href="${hrefTemplate(link + headingId)}" class="${classNames} transclusion">`
|
||||
`<a href="${hrefTemplate(
|
||||
link + headingId
|
||||
)}" class="${classNames} transclusion">`
|
||||
);
|
||||
this.raw(displayName);
|
||||
this.tag("</a>");
|
||||
@@ -125,11 +128,18 @@ function html(opts: HtmlOptions = {}) {
|
||||
)}#toolbar=0" class="${classNames}" />`
|
||||
);
|
||||
} else {
|
||||
this.tag(
|
||||
`<img src="${hrefTemplate(
|
||||
const hasDimensions = alias && /^\d+(x\d+)?$/.test(alias);
|
||||
// Take the target as alt text except if alt name was provided [[target|alt text]]
|
||||
const altText = hasDimensions || !alias ? target : alias;
|
||||
let imgAttributes = `src="${hrefTemplate(
|
||||
link
|
||||
)}" alt="${displayName}" class="${classNames}" />`
|
||||
);
|
||||
)}" alt="${altText}" class="${classNames}"`;
|
||||
|
||||
if (hasDimensions) {
|
||||
const { width, height } = getImageSize(alias as string);
|
||||
imgAttributes += ` width="${width}" height="${height}"`;
|
||||
}
|
||||
this.tag(`<img ${imgAttributes} />`);
|
||||
}
|
||||
} else {
|
||||
this.tag(
|
||||
|
||||
@@ -38,6 +38,5 @@ const defaultPathToPermalinkFunc = (
|
||||
.replace(markdownFolder, "") // make the permalink relative to the markdown folder
|
||||
.replace(/\.(mdx|md)/, "")
|
||||
.replace(/\\/g, "/") // replace windows backslash with forward slash
|
||||
.replace(/\/index$/, ""); // remove index from the end of the permalink
|
||||
return permalink.length > 0 ? permalink : "/"; // for home page
|
||||
};
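For orientation, a few illustrative input → output pairs for this default path-to-permalink conversion (the folder layout is hypothetical; the behaviour follows the replacements shown above):

```ts
// Illustrative conversions performed by the default path-to-permalink function:
//
//   content/blog/first-post.md  ->  /blog/first-post
//   content\blog\index.md       ->  /blog   (backslashes normalized, trailing /index dropped)
//   content/index.md            ->  /       (home page)
```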
@@ -1,9 +1,6 @@
|
||||
import * as path from "path";
|
||||
// import * as url from "url";
|
||||
import { getPermalinks } from "../src/utils";
|
||||
|
||||
// const __dirname = url.fileURLToPath(new URL(".", import.meta.url));
|
||||
// const markdownFolder = path.join(__dirname, "/fixtures/content");
|
||||
const markdownFolder = path.join(
|
||||
".",
|
||||
"test/fixtures/content"
|
||||
@@ -12,12 +9,12 @@ const markdownFolder = path.join(
|
||||
describe("getPermalinks", () => {
|
||||
test("should return an array of permalinks", () => {
|
||||
const expectedPermalinks = [
|
||||
"/", // /index.md
|
||||
"/README",
|
||||
"/abc",
|
||||
"/blog/first-post",
|
||||
"/blog/Second Post",
|
||||
"/blog/third-post",
|
||||
"/blog", // /blog/index.md
|
||||
"/blog/README",
|
||||
"/blog/tutorials/first-tutorial",
|
||||
"/assets/Pasted Image 123.png",
|
||||
];
|
||||
@@ -28,35 +25,4 @@ describe("getPermalinks", () => {
|
||||
expect(expectedPermalinks).toContain(permalink);
|
||||
});
|
||||
});
|
||||
|
||||
test("should return an array of permalinks with custom path -> permalink converter function", () => {
|
||||
const expectedPermalinks = [
|
||||
"/", // /index.md
|
||||
"/abc",
|
||||
"/blog/first-post",
|
||||
"/blog/second-post",
|
||||
"/blog/third-post",
|
||||
"/blog", // /blog/index.md
|
||||
"/blog/tutorials/first-tutorial",
|
||||
"/assets/pasted-image-123.png",
|
||||
];
|
||||
|
||||
const func = (filePath: string, markdownFolder: string) => {
|
||||
const permalink = filePath
|
||||
.replace(markdownFolder, "") // make the permalink relative to the markdown folder
|
||||
.replace(/\.(mdx|md)/, "")
|
||||
.replace(/\\/g, "/") // replace windows backslash with forward slash
|
||||
.replace(/\/index$/, "") // remove index from the end of the permalink
|
||||
.replace(/ /g, "-") // replace spaces with hyphens
|
||||
.toLowerCase(); // convert to lowercase
|
||||
|
||||
return permalink.length > 0 ? permalink : "/"; // for home page
|
||||
};
|
||||
|
||||
const permalinks = getPermalinks(markdownFolder, [/\.DS_Store/], func);
|
||||
expect(permalinks).toHaveLength(expectedPermalinks.length);
|
||||
permalinks.forEach((permalink) => {
|
||||
expect(expectedPermalinks).toContain(permalink);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
@@ -48,7 +48,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
html({
|
||||
permalinks: ["/some/folder/Wiki Link"],
|
||||
pathFormat: "obsidian-short",
|
||||
}) as any // TODO type fix
|
||||
}) as any, // TODO type fix
|
||||
],
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
@@ -75,7 +75,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
html({
|
||||
permalinks: ["/some/folder/Wiki Link"],
|
||||
pathFormat: "obsidian-absolute",
|
||||
}) as any // TODO type fix
|
||||
}) as any, // TODO type fix
|
||||
],
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
@@ -97,10 +97,14 @@ describe("micromark-extension-wiki-link", () => {
|
||||
});
|
||||
|
||||
test("parses a wiki link with heading and alias", () => {
|
||||
const serialized = micromark("[[Wiki Link#Some Heading|Alias]]", "ascii", {
|
||||
const serialized = micromark(
|
||||
"[[Wiki Link#Some Heading|Alias]]",
|
||||
"ascii",
|
||||
{
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
}
|
||||
);
|
||||
// note: lowercased and hyphenated heading
|
||||
expect(serialized).toBe(
|
||||
'<p><a href="Wiki Link#some-heading" class="internal new">Alias</a></p>'
|
||||
@@ -134,7 +138,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe("<p>![[My Image.xyz]]</p>");
|
||||
expect(serialized).toBe('<p>![[My Image.xyz]]</p>');
|
||||
});
|
||||
|
||||
test("parses and image ambed with a matching permalink", () => {
|
||||
@@ -147,6 +151,28 @@ describe("micromark-extension-wiki-link", () => {
|
||||
);
|
||||
});
|
||||
|
||||
// TODO: Fix alt attribute
|
||||
test("Can identify the dimensions of the image if exists", () => {
|
||||
const serialized = micromark("![[My Image.jpg|200]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html({ permalinks: ["My Image.jpg"] }) as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><img src="My Image.jpg" alt="My Image.jpg" class="internal" width="200" height="200" /></p>'
|
||||
);
|
||||
});
|
||||
|
||||
// TODO: Fix alt attribute
|
||||
test("Can identify the dimensions of the image if exists", () => {
|
||||
const serialized = micromark("![[My Image.jpg|200x200]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html({ permalinks: ["My Image.jpg"] }) as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><img src="My Image.jpg" alt="My Image.jpg" class="internal" width="200" height="200" /></p>'
|
||||
);
|
||||
});
|
||||
|
||||
test("parses an image embed with a matching permalink and Obsidian-style shortedned path", () => {
|
||||
const serialized = micromark("![[My Image.jpg]]", {
|
||||
extensions: [syntax()],
|
||||
@@ -154,7 +180,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
html({
|
||||
permalinks: ["/assets/My Image.jpg"],
|
||||
pathFormat: "obsidian-short",
|
||||
}) as any // TODO type fix
|
||||
}) as any, // TODO type fix
|
||||
],
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
@@ -189,7 +215,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe("<p>[[Wiki Link</p>");
|
||||
expect(serialized).toBe('<p>[[Wiki Link</p>');
|
||||
});
|
||||
|
||||
test("doesn't parse a wiki link with one missing closing bracket", () => {
|
||||
@@ -197,7 +223,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe("<p>[[Wiki Link]</p>");
|
||||
expect(serialized).toBe('<p>[[Wiki Link]</p>');
|
||||
});
|
||||
|
||||
test("doesn't parse a wiki link with a missing opening bracket", () => {
|
||||
@@ -205,7 +231,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe("<p>[Wiki Link]]</p>");
|
||||
expect(serialized).toBe('<p>[Wiki Link]]</p>');
|
||||
});
|
||||
|
||||
test("doesn't parse a wiki link in single brackets", () => {
|
||||
@@ -213,7 +239,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe("<p>[Wiki Link]</p>");
|
||||
expect(serialized).toBe('<p>[Wiki Link]</p>');
|
||||
});
|
||||
});
|
||||
|
||||
@@ -225,7 +251,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
html({
|
||||
newClassName: "test-new",
|
||||
wikiLinkClassName: "test-wiki-link",
|
||||
}) as any // TODO type fix
|
||||
}) as any, // TODO type fix
|
||||
],
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
@@ -251,7 +277,7 @@ describe("micromark-extension-wiki-link", () => {
|
||||
wikiLinkResolver: (page) => [
|
||||
page.replace(/\s+/, "-").toLowerCase(),
|
||||
],
|
||||
}) as any // TODO type fix
|
||||
}) as any, // TODO type fix
|
||||
],
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
@@ -260,56 +286,6 @@ describe("micromark-extension-wiki-link", () => {
|
||||
});
|
||||
});
|
||||
|
||||
test("parses wiki links to index files", () => {
|
||||
const serialized = micromark("[[/some/folder/index]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><a href="/some/folder" class="internal new">/some/folder/index</a></p>'
|
||||
);
|
||||
});
|
||||
|
||||
describe("other", () => {
|
||||
test("parses a wiki link to some index page in a folder with no matching permalink", () => {
|
||||
const serialized = micromark("[[/some/folder/index]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><a href="/some/folder" class="internal new">/some/folder/index</a></p>'
|
||||
);
|
||||
});
|
||||
|
||||
test("parses a wiki link to some index page in a folder with a matching permalink", () => {
|
||||
const serialized = micromark("[[/some/folder/index]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html({ permalinks: ["/some/folder"] }) as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><a href="/some/folder" class="internal">/some/folder/index</a></p>'
|
||||
);
|
||||
});
|
||||
|
||||
test("parses a wiki link to home index page with no matching permalink", () => {
|
||||
const serialized = micromark("[[/index]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html() as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe(
|
||||
'<p><a href="/" class="internal new">/index</a></p>'
|
||||
);
|
||||
});
|
||||
|
||||
test("parses a wiki link to home index page with a matching permalink", () => {
|
||||
const serialized = micromark("[[/index]]", "ascii", {
|
||||
extensions: [syntax()],
|
||||
htmlExtensions: [html({ permalinks: ["/"] }) as any], // TODO type fix
|
||||
});
|
||||
expect(serialized).toBe('<p><a href="/" class="internal">/index</a></p>');
|
||||
});
|
||||
});
|
||||
|
||||
describe("transclusions", () => {
|
||||
test("parsers a transclusion as a regular wiki link", () => {
|
||||
const serialized = micromark("![[Some Page]]", "ascii", {
|
||||
@@ -330,5 +306,5 @@ describe("micromark-extension-wiki-link", () => {
|
||||
});
|
||||
expect(serialized).toBe(`<p><a href="li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô-íã_a(n)d_underline!:ª%@'*º$-°~./\\" class="internal new">li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#LI NK-W(i)th-àcèô íã_a(n)d_uNdErlinE!:ª%@'*º$ °~./\\</a></p>`);
|
||||
});
|
||||
})
|
||||
});
|
||||
});
|
||||
|
||||
@@ -246,6 +246,28 @@ describe("remark-wiki-link", () => {
|
||||
expect(node.data?.hName).toEqual("img");
|
||||
expect((node.data?.hProperties as any).src).toEqual("My Image.png");
|
||||
expect((node.data?.hProperties as any).alt).toEqual("My Image.png");
|
||||
expect((node.data?.hProperties as any).width).toBeUndefined();
|
||||
expect((node.data?.hProperties as any).height).toBeUndefined();
|
||||
});
|
||||
});
|
||||
|
||||
test("Can identify the dimensions of the image if exists", () => {
|
||||
const processor = unified().use(markdown).use(wikiLinkPlugin);
|
||||
|
||||
let ast = processor.parse("![[My Image.png|132x612]]");
|
||||
ast = processor.runSync(ast);
|
||||
|
||||
expect(select("wikiLink", ast)).not.toEqual(null);
|
||||
|
||||
visit(ast, "wikiLink", (node: Node) => {
|
||||
expect(node.data?.isEmbed).toEqual(true);
|
||||
expect(node.data?.target).toEqual("My Image.png");
|
||||
expect(node.data?.permalink).toEqual("My Image.png");
|
||||
expect(node.data?.hName).toEqual("img");
|
||||
expect((node.data?.hProperties as any).src).toEqual("My Image.png");
|
||||
expect((node.data?.hProperties as any).alt).toEqual("My Image.png");
|
||||
expect((node.data?.hProperties as any).width).toBe("132");
|
||||
expect((node.data?.hProperties as any).height).toBe("612");
|
||||
});
|
||||
});
|
||||
|
||||
@@ -365,13 +387,17 @@ describe("remark-wiki-link", () => {
test("parses a link with special characters and symbols", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô íã_a(n)D_UNDERLINE!:ª%@'*º$ °~./\\]]");
let ast = processor.parse(
"[[li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô íã_a(n)D_UNDERLINE!:ª%@'*º$ °~./\\]]"
);
ast = processor.runSync(ast);
expect(select("wikiLink", ast)).not.toEqual(null);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\");
expect(node.data?.permalink).toEqual(
"li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\"
);
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
@@ -383,9 +409,9 @@ describe("remark-wiki-link", () => {
expect((node.data?.hChildren as any)[0].value).toEqual(
"li nk-w(i)th-àcèô íã_a(n)d_underline!:ª%@'*º$ °~./\\#li-nk-w(i)th-àcèô íã_a(n)D_UNDERLINE!:ª%@'*º$ °~./\\"
);
})
});
})
});
});

describe("invalid wiki links", () => {
test("doesn't parse a wiki link with two missing closing brackets", () => {
@@ -459,109 +485,6 @@ describe("remark-wiki-link", () => {
});
});

test("parses wiki links to index files", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

expect(select("wikiLink", ast)).not.toEqual(null);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal new");
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

describe("other", () => {
test("parses a wiki link to some index page in a folder with no matching permalink", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
"internal new"
);
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

test("parses a wiki link to some index page in a folder with a matching permalink", () => {
const processor = unified()
.use(markdown)
.use(wikiLinkPlugin, { permalinks: ["/some/folder"] });

let ast = processor.parse("[[/some/folder/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(true);
expect(node.data?.permalink).toEqual("/some/folder");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal");
expect((node.data?.hProperties as any).href).toEqual("/some/folder");
expect((node.data?.hChildren as any)[0].value).toEqual(
"/some/folder/index"
);
});
});

test("parses a wiki link to home index page with no matching permalink", () => {
const processor = unified().use(markdown).use(wikiLinkPlugin);

let ast = processor.parse("[[/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(false);
expect(node.data?.permalink).toEqual("/");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual(
"internal new"
);
expect((node.data?.hProperties as any).href).toEqual("/");
expect((node.data?.hChildren as any)[0].value).toEqual("/index");
});
});

test("parses a wiki link to home index page with a matching permalink", () => {
const processor = unified()
.use(markdown)
.use(wikiLinkPlugin, { permalinks: ["/"] });

let ast = processor.parse("[[/index]]");
ast = processor.runSync(ast);

visit(ast, "wikiLink", (node: Node) => {
expect(node.data?.exists).toEqual(true);
expect(node.data?.permalink).toEqual("/");
expect(node.data?.alias).toEqual(null);
expect(node.data?.hName).toEqual("a");
expect((node.data?.hProperties as any).className).toEqual("internal");
expect((node.data?.hProperties as any).href).toEqual("/");
expect((node.data?.hChildren as any)[0].value).toEqual("/index");
});
});
});

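Taken together, the tests above fix the index-resolution rules: `[[/some/folder/index]]` resolves to the permalink `/some/folder`, `[[/index]]` resolves to `/`, and a link is rendered with class `internal` rather than `internal new` only when its permalink appears in the `permalinks` option. A minimal sketch of how a site would pass that option (the permalink list is hard-coded here for illustration; a real site would collect it from its content folder):

```ts
// Sketch only: supplying the `permalinks` option the tests above exercise.
import { unified } from "unified";
import remarkParse from "remark-parse";
import wikiLinkPlugin from "@portaljs/remark-wiki-link"; // assumed import path

// In a real site these routes would be collected from the content folder.
const permalinks = ["/", "/some/folder"];

const processor = unified()
  .use(remarkParse)
  .use(wikiLinkPlugin, { permalinks });

// "[[/some/folder/index]]" now resolves to href "/some/folder" and is marked
// "internal" instead of "internal new", because the permalink is known.
const ast = processor.runSync(processor.parse("[[/some/folder/index]]"));
console.log(ast);
```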
describe("transclusions", () => {
|
||||
test("replaces a transclusion with a regular wiki link", () => {
|
||||
const processor = unified().use(markdown).use(wikiLinkPlugin);
|
||||
@@ -586,4 +509,3 @@ describe("remark-wiki-link", () => {
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
|
||||
@@ -22,11 +22,41 @@ const items = [
sourceUrl: 'https://github.com/FCSCOpendata/frontend',
},
{
title: 'Datahub Open Data',
href: 'https://opendata.datahub.io/',
image: '/images/showcases/datahub.webp',
description: 'Demo Data Portal by DataHub',
title: 'Frictionless Data',
href: 'https://datahub.io/core/co2-ppm',
repository: 'https://github.com/datopian/datahub/tree/main/examples/dataset-frictionless',
image: '/images/showcases/frictionless-capture.png',
description: 'Progressive open-source framework for building data infrastructure - data management, data integration, data flows, etc. It includes various data standards and provides software to work with data.',
},
{
title: "OpenSpending",
image: "/images/showcases/openspending.png",
href: "https://www.openspending.org",
repository: 'https://github.com/datopian/datahub/tree/main/examples/openspending',
description: "OpenSpending is a free, open and global platform to search, visualise and analyse fiscal data in the public sphere."
},
{
title: "FiveThirtyEight",
image: "/images/showcases/fivethirtyeight.png",
href: "https://fivethirtyeight.portaljs.org/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/fivethirtyeight',
description: "This is a replica of data.fivethirtyeight.com using PortalJS."
},
{
title: "Github Datasets",
image: "/images/showcases/github-datasets.png",
href: "https://example.portaljs.org/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog',
description: "A simple data catalog that gets its data from a list of GitHub repos that serve as datasets."
},
{
title: "Hatespeech Data",
image: "/images/showcases/turing.png",
href: "https://hatespeechdata.com/",
repository: 'https://github.com/datopian/datahub/tree/main/examples/turing',
description: "Datasets annotated for hate speech, online abuse, and offensive language which are useful for training a natural language processing system to detect this online abuse."
},

];

export default function Showcases() {

@@ -1,10 +1,6 @@
export default function ShowcasesItem({ item }) {
return (
<a
className="rounded overflow-hidden group relative border-1 shadow-lg"
target="_blank"
href={item.href}
>
<div className="rounded overflow-hidden group relative border-1 shadow-lg">
<div
className="bg-cover bg-no-repeat bg-top aspect-video w-full group-hover:blur-sm group-hover:scale-105 transition-all duration-200"
style={{ backgroundImage: `url(${item.image})` }}
@@ -16,9 +12,48 @@ export default function ShowcasesItem({ item }) {
<div className="text-center text-primary-dark">
<span className="text-xl font-semibold">{item.title}</span>
<p className="text-base font-medium">{item.description}</p>
</div>
</div>
</div>
<div className="flex justify-center mt-2 gap-2 ">
{item.href && (
<a
target="_blank"
className=" text-white w-8 h-8 p-1 bg-primary rounded-full hover:scale-110 transition cursor-pointer z-50"
rel="noreferrer"
href={item.href}
>
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 420 420"
stroke="white"
fill="none"
>
<path stroke-width="26" d="M209,15a195,195 0 1,0 2,0z" />
<path
stroke-width="18"
d="m210,15v390m195-195H15M59,90a260,260 0 0,0 302,0 m0,240 a260,260 0 0,0-302,0M195,20a250,250 0 0,0 0,382 m30,0 a250,250 0 0,0 0-382"
/>
</svg>
</a>
)}
{item.repository && (
<a
target="_blank"
rel="noreferrer"
className="w-8 h-8 bg-black rounded-full p-1 hover:scale-110 transition cursor-pointer z-50"
href={item.repository}
>
<svg
aria-hidden="true"
viewBox="0 0 16 16"
fill="currentColor"
>
<path d="M8 0C3.58 0 0 3.58 0 8C0 11.54 2.29 14.53 5.47 15.59C5.87 15.66 6.02 15.42 6.02 15.21C6.02 15.02 6.01 14.39 6.01 13.72C4 14.09 3.48 13.23 3.32 12.78C3.23 12.55 2.84 11.84 2.5 11.65C2.22 11.5 1.82 11.13 2.49 11.12C3.12 11.11 3.57 11.7 3.72 11.94C4.44 13.15 5.59 12.81 6.05 12.6C6.12 12.08 6.33 11.73 6.56 11.53C4.78 11.33 2.92 10.64 2.92 7.58C2.92 6.71 3.23 5.99 3.74 5.43C3.66 5.23 3.38 4.41 3.82 3.31C3.82 3.31 4.49 3.1 6.02 4.13C6.66 3.95 7.34 3.86 8.02 3.86C8.7 3.86 9.38 3.95 10.02 4.13C11.55 3.09 12.22 3.31 12.22 3.31C12.66 4.41 12.38 5.23 12.3 5.43C12.81 5.99 13.12 6.7 13.12 7.58C13.12 10.65 11.25 11.33 9.47 11.53C9.76 11.78 10.01 12.26 10.01 13.01C10.01 14.08 10 14.94 10 15.21C10 15.42 10.15 15.67 10.55 15.59C13.71 14.53 16 11.53 16 8C16 3.58 12.42 0 8 0Z" />
</svg>
</a>
)}
</div>
</div>
</div>
</div>
</div>
);
}

@@ -4,7 +4,7 @@ authors: ['Luccas Mateus']
date: 2021-04-20
---

We have created a full data portal demo using PortalJS all backed by a CKAN instance storing data and metadata, you can see below a screenshot of the homepage and of an individual dataset page.
We have created a full data portal demo using DataHub PortalJS all backed by a CKAN instance storing data and metadata, you can see below a screenshot of the homepage and of an individual dataset page.



@@ -14,7 +14,7 @@ We have created a full data portal demo using PortalJS all backed by a CKAN inst
To create a Portal app, run the following command in your terminal:

```console
npx create-next-app -e https://github.com/datopian/portaljs/tree/main/examples/ckan
npx create-next-app -e https://github.com/datopian/datahub/tree/main/examples/ckan
```

> NB: Under the hood, this uses the tool called create-next-app, which bootstraps an app for you based on our CKAN example.

@@ -11,19 +11,18 @@ const config = {
authorUrl: 'https://datopian.com/',
navbarTitle: {
// logo: "/images/logo.svg",
text: '🌀 PortalJS',
text: '🌀 DataHub PortalJS',
// version: "Alpha",
},
navLinks: [
{ name: 'Docs', href: '/docs' },
// { name: "Components", href: "/docs/components" },
{ name: 'Blog', href: '/blog' },
{ name: 'Showcases', href: '/#showcases' },
{ name: 'Howtos', href: '/howtos' },
{ name: 'Guide', href: '/guide' },
{
name: 'Examples',
href: '/examples/'
name: 'Showcases',
href: '/showcases/'
},
{
name: 'Components',
@@ -45,6 +44,7 @@ const config = {
{ rel: 'icon', href: '/favicon.ico' },
{ rel: 'apple-touch-icon', href: '/icon.png', sizes: '120x120' },
],
canonical: 'https://portaljs.com/',
openGraph: {
type: 'website',
title:
@@ -68,8 +68,8 @@ const config = {
cardType: 'summary_large_image',
},
},
github: 'https://github.com/datopian/portaljs',
discord: 'https://discord.gg/xfFDMPU9dC',
github: 'https://github.com/datopian/datahub',
discord: 'https://discord.gg/KrRzMKU',
tableOfContents: true,
analytics: 'G-96GWZHMH57',
// editLinkShow: true,

@@ -26,7 +26,7 @@ Below are some screenshots:
- Create a new app with `create-next-app`:

```
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan-example
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan-example
cd <app-name>
```

@@ -49,7 +49,7 @@ If you go to any one of those pages by clicking on `More info` you will see somet

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fckan-example&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fckan-example&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)

By clicking on this button, you will be redirected to a page which will allow you to clone the content into your own GitHub/GitLab/BitBucket account and automatically deploy everything.

@@ -70,6 +70,6 @@ npm run start

## Links

- [Repo](https://github.com/datopian/portaljs/tree/main/examples/ckan-example)
- [Repo](https://github.com/datopian/datahub/tree/main/examples/ckan-example)
- [Live Demo](https://ckan-example.portaljs.org)

@@ -26,7 +26,7 @@ To get a feel of the project, check out the demo at [live deployment](https://ck
Navigate to the directory in which you want to create the project folder and run the following command:

```
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/ckan
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/ckan
cd <app-name>
```

@@ -56,7 +56,7 @@ If you navigate to any of the dataset pages by clicking on the dataset title you

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fckan&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fckan&env=DMS&envDescription=URL%20For%20the%20CKAN%20Backend%20Ex%3A%20https%3A%2F%2Fdemo.dev.datopian.com)

By clicking on this button, you will be redirected to a page which allows you to clone the base project into your own GitHub/GitLab/BitBucket account and automatically deploy it.

@@ -158,6 +158,6 @@ Thanks to TypeScript, you can get a list of all the API methods in `@portaljs/ck
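A sketch of what that looks like in practice — the `CKAN` client and the method name below are assumptions about the `@portaljs/ckan` API used by this example, not something this diff confirms; the package's type definitions are the authoritative list:

```ts
// Illustrative only: the class and method names are assumed — check the package types.
import { CKAN } from "@portaljs/ckan";

// DMS is the CKAN backend URL, e.g. https://demo.dev.datopian.com
const ckan = new CKAN(process.env.DMS ?? "https://demo.dev.datopian.com");

export async function listDatasets() {
  // Because the client ships with TypeScript types, your editor can enumerate
  // every available API method from the types alone.
  const datasets = await ckan.getDatasetsList(); // hypothetical method name
  return datasets;
}
```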
## Links

- [Repo](https://github.com/datopian/portaljs/tree/main/examples/ckan)
- [Repo](https://github.com/datopian/datahub/tree/main/examples/ckan)
- [Live Demo](http://ckan.portaljs.org/)

@@ -1,48 +0,0 @@
---
title: "Example: showcase for a single Frictionless dataset"
authors: ['Luccas Mateus']
date: 2023-04-20
filetype: blog
---

**See the repo:** https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless

This example creates a portal/showcase for a single dataset. The dataset should be a [Frictionless dataset (data package)][fd] i.e. there should be a `datapackage.json`.

[fd]: https://frictionlessdata.io/data-packages/

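For reference, the descriptor this example expects is small. A minimal sketch of the `datapackage.json` shape, written here as a TypeScript object to match the other snippets (field values are placeholders; `resources` is the essential part, `name`/`title` are the usual metadata):

```ts
// Minimal Frictionless Data Package descriptor, i.e. what datapackage.json contains.
const datapackage = {
  name: "my-dataset", // placeholder
  title: "My Dataset", // placeholder
  resources: [
    {
      name: "data",
      path: "data/data.csv", // path to the data file, relative to datapackage.json
      format: "csv",
    },
  ],
};

export default datapackage;
```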
## How to use

```bash
npx create-next-app -e https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless
# choose a name for your portal when prompted e.g. your-portal or go with default my-app

# then run it
cd your-portal
yarn #install packages
yarn dev #start app in dev mode
```

You should see the demo portal running with the example dataset provided:

<img src="/assets/examples/frictionless-dataset-demo.gif" />

### Use your own dataset

You can try it out with other [Frictionless datasets](https://datahub.io/search).

In the directory of your portal do:

```bash
export PORTAL_DATASET_PATH=/path/to/my/dataset
```

Then restart the dev server:

```
yarn dev
```

Check the portal page and it should have updated e.g. like:


@@ -33,7 +33,7 @@ Run the following commands:


```bash
npx create-next-app <app-name> --example https://github.com/datopian/portaljs/tree/main/examples/github-backed-catalog
npx create-next-app <app-name> --example https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog
cd <app-name>
```

@@ -61,7 +61,7 @@ Congratulations, your new app is now running at http://localhost:3000.

## Deployment

[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fportaljs%2Ftree%2Fmain%2Fexamples%2Fgithub-backed-catalog)
[](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fdatopian%2Fdatahub%2Ftree%2Fmain%2Fexamples%2Fgithub-backed-catalog)

By clicking on this button, you will be redirected to a page which will allow you to clone the example into your own GitHub/GitLab/BitBucket account and automatically deploy it.

@@ -119,5 +119,5 @@ npm run start

## Links

- [Repo](https://github.com/datopian/portaljs/tree/main/examples/github-backed-catalog)
- [Repo](https://github.com/datopian/datahub/tree/main/examples/github-backed-catalog)
- [Live Demo](https://example.portaljs.org)

@@ -3,9 +3,9 @@ title: Getting Started
description: 'Getting started guide and tutorial about data portal-building with PortalJS!'
---

Welcome to the PortalJS documentation!
Welcome to the DataHub PortalJS documentation!

If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portaljs/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).
If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/datahub/discussions) or on [our chat channel on Discord](https://discord.com/invite/KrRzMKU).

## Setup

@@ -16,10 +16,10 @@ If you have questions about anything related to PortalJS, you're always welcome

### Create a PortalJS app

To create a PortalJS app, open your terminal, cd into the directory you’d like to create the app in, and run the following command:
To create a DataHub PortalJS app, open your terminal, cd into the directory you’d like to create the app in, and run the following command:

```bash
npx create-next-app my-data-portal --example https://github.com/datopian/portaljs/tree/main/examples/learn
npx create-next-app my-data-portal --example https://github.com/datopian/datahub/tree/main/examples/learn
```

> [!tip]

@@ -1,5 +0,0 @@
# Examples

For now, see the examples folder in github:

https://github.com/datopian/portaljs/tree/main/examples
@@ -11,5 +11,5 @@ description: Learn more about how you can achieve different data portal features
- [[howtos/drd|How to create data-rich documents with charts and tables?]]
- [[howtos/comments|How to add user comments?]]

If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/portaljs/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).
If you have questions about anything related to PortalJS, you're always welcome to ask our community on [GitHub Discussions](https://github.com/datopian/datahub/discussions) or on [our chat channel on Discord](https://discord.gg/EeyfGrGu4U).

@@ -50,7 +50,7 @@ function MyApp({ Component, pageProps }) {
<DefaultSeo
defaultTitle={siteConfig.title}
description={siteConfig.description}
titleTemplate="PortalJS - %s"
titleTemplate="DataHub PortalJS - %s"
{...siteConfig.nextSeo}
/>

@@ -35,7 +35,7 @@ export default function Home({ sidebarTree }) {
sidebarTree={sidebarTree}
>
<Features />
<Showcases />

<Community />
</Layout>
</>

8
site/pages/showcases.tsx
Normal file
@@ -0,0 +1,8 @@
import Layout from "@/components/Layout";
import Showcases from "@/components/Showcases";

export default function ShowcasesList() {
return (
<Layout><Showcases/></Layout>
)
}

BIN
site/public/images/showcases/fivethirtyeight.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 66 KiB
BIN
site/public/images/showcases/frictionless-capture.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 70 KiB
BIN
site/public/images/showcases/github-datasets.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 41 KiB
BIN
site/public/images/showcases/openspending.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 62 KiB
BIN
site/public/images/showcases/turing.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 88 KiB
@@ -1,7 +1,7 @@
#!/bin/bash
rm -rf portal
mkdir -p portal
npx create-next-app portal -e https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless
npx create-next-app portal -e https://github.com/datopian/datahub/tree/main/examples/dataset-frictionless
mkdir portal/public/dataset

cp -a ./data portal/public/dataset
@@ -12,7 +12,7 @@ PORTAL_DATASET_PATH=$PWD"/portal/public/dataset"
export PORTAL_DATASET_PATH

mkdir -p .github && mkdir -p .github/workflows && touch .github/workflows/main.yml
curl https://raw.githubusercontent.com/datopian/portaljs/main/site/public/scripts/gh-page-builder-action.yml > .github/workflows/main.yml
curl https://raw.githubusercontent.com/datopian/datahub/main/site/public/scripts/gh-page-builder-action.yml > .github/workflows/main.yml

cd portal
assetPrefix='"/'$PORTAL_REPO_NAME'/"'

@@ -3,7 +3,7 @@ git checkout -b gh-pages
git rm -r --cached .
rm -rf portal
mkdir -p portal
npx create-next-app portal -e https://github.com/datopian/portaljs/tree/main/examples/dataset-frictionless
npx create-next-app portal -e https://github.com/datopian/datahub/tree/main/examples/dataset-frictionless
mkdir portal/public/dataset

cp -a ./data portal/public/dataset