
linting tweaks

Matthias Kretschmann 2023-10-04 01:54:41 +01:00
parent ab582a5093
commit 7ed6830df0
Signed by: m
GPG Key ID: 606EEEF3C479A91F
21 changed files with 92 additions and 75 deletions

View File

@@ -2,9 +2,5 @@
"default": true,
"line_length": false,
"no-inline-html": false,
"fenced-code-language": false,
"first-line-h1": false,
"first-heading-h1": false,
"no-bare-urls": false,
"no-trailing-punctuation": false
"first-line-h1": false
}

View File

@@ -9,4 +9,4 @@ tags:
- wordpress
---
A fresh start of my website under the new domain www.kremalicious.com and with [Wordpress](http://www.wordpress.org) under the hood. My former website under [www.jpberlin.de/krema](http://www.jpberlin.de/krema) will no longer be updated and will be deleted soon. I have only imported a few posts from the old weblog.
A fresh start of my website under the new domain `www.kremalicious.com` and with [Wordpress](http://www.wordpress.org) under the hood. My former website under [www.jpberlin.de/krema](http://www.jpberlin.de/krema) will no longer be updated and will be deleted soon. I have only imported a few posts from the old weblog.

View File

@@ -91,7 +91,7 @@ sudo vi /etc/default/netatalk
vim should pop up with the file loaded as superuser (needed for saving). Find the "#Set which daemons to run" part and replace the default values with these to enable just AFP and disable all unneeded services. Let the cnid_metad daemon run too, and if you want to [share your Linux-connected printer with your Mac](http://www.zaphu.com/2008/04/29/ubuntu-guide-configure-netatalk-to-share-a-usb-printer/), also enable the pap daemon (set it to yes):
```
```ini
ATALKD_RUN=no
PAPD_RUN=no
CNID_METAD_RUN=yes
@@ -108,9 +108,9 @@ Press Ctrl + S to save the document or choose File > Save. Next we have to edit
sudo vi /etc/netatalk/afpd.conf
```
Scroll to the very bottom of the document and add this line (replace the whole line in case there's already one). This is one line so be sure that there's no line break in your afpd.conf file:
Scroll to the very bottom of the document and add this line (replace the whole line in case there's already one). This is one line so be sure that there's no line break in your `afpd.conf` file:
```
```ini
-transall -uamlist uams_randnum.so,uams_dhx.so -nosavepassword -advertise_ssh
```
@@ -126,19 +126,19 @@ sudo vi /etc/netatalk/AppleVolumes.default
Scroll to the bottom of the document and define your Volume shares. By adding the following line you will share each user's home directory with the user name as the Volume name. To make things more secure you can define all users who are allowed to connect to your Ubuntu box via AFP:
```
```ini
~/ "$u" allow:username1,username2 cnidscheme:cdb
```
Because we want to use the Ubuntu machine as a backup server for Time Machine you should define a second volume just for Time Machine. Create a new folder in your home directory first and name it TimeMachine (or anything you like). Then add the following line to your AppleVolumes.default. This is one line so be sure that there's no line break in your `AppleVolumes.default` file:
```
```ini
/home/username/TimeMachine TimeMachine allow:username1,username2 cnidscheme:cdb options:usedots,upriv
```
Thanks to [tsanga](http://www.kremalicious.com/2008/06/ubuntu-as-mac-file-server-and-time-machine-volume/#comment-50) for pointing out the usedots and upriv options. The usedots option is required if you want to use invisible files and folders (those starting with a dot in the name). Otherwise afpd would encode them as :2e, which is bad if you have to use invisible files (like .htaccess). If you're on Leopard **and have no Macs running Tiger or mixed OS X versions in your network** you should use the upriv option, which adds support for AFP3 unix privileges. If you have Macs with Tiger installed just use options:usedots to avoid unexpected behavior:
```
```ini
/home/username/TimeMachine TimeMachine allow:username1,username2 cnidscheme:cdb options:usedots
```
@@ -171,7 +171,7 @@ sudo vi /etc/nsswitch.conf
Just add `mdns` at the end of the line that starts with `hosts:`. Now the line should look like this:
```
```ini
hosts: files mdns4_minimal [NOTFOUND=return] dns mdns4 mdns
```
@@ -288,15 +288,11 @@ In short you have to allow communications over ports 548 and 5353.
If you get one of those errors:
```
Connection Failed - There was an error connecting to the server. Check the server name or IP address and try again
```
> Connection Failed - There was an error connecting to the server. Check the server name or IP address and try again
or
```
There was an error connecting to the server. Check the server name or IP address and try again. If you are unable to resolve the problem contact your system administrator.
```
> There was an error connecting to the server. Check the server name or IP address and try again. If you are unable to resolve the problem contact your system administrator.
you should first make sure that either no firewall is in use on your Ubuntu box or that it is configured to allow AFP communications as suggested in the paragraph above.
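If ufw happens to be the firewall on the Ubuntu box (the article only names the ports, so take this as a sketch for that case), allowing AFP and Bonjour traffic looks roughly like this:
```bash
# AFP file sharing runs over TCP port 548
sudo ufw allow 548/tcp
# Bonjour/mDNS discovery (avahi) uses UDP port 5353
sudo ufw allow 5353/udp
```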
@@ -310,7 +306,7 @@ sudo vi /etc/hosts
Add the following two lines at the very top of the file.
```
```ini
127.0.0.1 localhost
127.0.1.1 Rockhopper.local Rockhopper
```
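As a quick check that mDNS resolution works after these edits, the hostname should now resolve on both machines. This is a sketch using the example hostname Rockhopper from the hosts file above; avahi-resolve comes from the avahi-utils package on Ubuntu, and dns-sd ships with OS X:
```bash
# on the Ubuntu box: resolve the .local name through avahi
avahi-resolve --name Rockhopper.local
# on the Mac: browse for AFP services advertised on the local network
dns-sd -B _afpovertcp._tcp
```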
@@ -383,7 +379,7 @@ Because I've just modified Apple's standard icons these icons are just available
In the avahi configuration part of this article you have assigned the Xserve device info to your afpd.service file. All you have to do is replace the generic Xserve icon (or whatever model you have assigned in your afpd.service file) with an icon from this icon package. Just rename Ubuntu Server.icns to com.apple.xserve.icns and navigate to
```
```bash
/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources
```
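A sketch of that rename-and-copy step from the Terminal, assuming the icon package was unpacked to ~/Downloads (adjust the path to wherever you saved it), and keeping a backup of Apple's original icon so it can be restored later:
```bash
cd /System/Library/CoreServices/CoreTypes.bundle/Contents/Resources
# back up the stock Xserve icon
sudo cp com.apple.xserve.icns com.apple.xserve.icns.bak
# drop in the renamed replacement from the icon package
sudo cp ~/Downloads/"Ubuntu Server.icns" com.apple.xserve.icns
```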
@@ -393,7 +389,7 @@ If you've used another model in your afpd.service file, browse the Resources of
As for the Windows Vista server icon: Just rename the Windows Server.icns file to public.generic-pc.icns and navigate to
```
```bash
/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources
```

File diff suppressed because one or more lines are too long

View File

@@ -90,7 +90,7 @@ I've decided to keep this functionality intact but hide the whole comment sect
The RSS feeds for posts and comments have been moved to Feedburner. You can just use kremalicious.com/feed.xml.
Currently it looks like Google has some problems recognizing kremalicious.com/feed as the replacement for www.kremalicious.com/feed so it may show you multiple entries if you search for it in Google Reader.
Currently it looks like Google has some problems recognizing `kremalicious.com/feed` as the replacement for `www.kremalicious.com/feed` so it may show you multiple entries if you search for it in Google Reader.
### Twitter & Google+

View File

@@ -85,7 +85,7 @@ In the background, the code base changed drastically. We now have only one Aquar
If you're interested in all the detailed code changes, you can follow along with the [main Pull Request](https://github.com/oceanprotocol/market/pull/628) which has references and screenshots for all other changes done against it. This is also the best place to start if you run your own fork of the market and want to integrate the latest multi-network changes without looking at just one big change in `main`.
### Check it out!
### Check it out
Head to [market.oceanprotocol.com](https://market.oceanprotocol.com) and see assets currently mixed from 3 networks by default.

View File

@@ -28,7 +28,7 @@ One significant advantage of generating favicons dynamically is cache busting. W
To begin, these are the source files we will work with, containing only 2 image assets:
```
```text
my-astro-project/
├── src/
│ ├── pages/
@@ -56,7 +56,7 @@ After building the project, the generated favicon files will be placed in the `d
This should be present in your `dist/` folder after following the rest of this article:
```
```text
my-astro-project/
├── dist/
│ ├── favicon.ico

package-lock.json generated
View File

@@ -5406,9 +5406,9 @@
}
},
"node_modules/caniuse-lite": {
"version": "1.0.30001538",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001538.tgz",
"integrity": "sha512-HWJnhnID+0YMtGlzcp3T9drmBJUVDchPJ08tpUGFLs9CYlwWPH2uLgpHn8fND5pCgXVtnGS3H4QR9XLMHVNkHw==",
"version": "1.0.30001543",
"resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001543.tgz",
"integrity": "sha512-qxdO8KPWPQ+Zk6bvNpPeQIOH47qZSYdFZd6dXQzb2KzhnSXju4Kd7H1PkSJx6NICSMgo/IhRZRhhfPTHYpJUCA==",
"funding": [
{
"type": "opencollective",
@@ -8005,19 +8005,19 @@
"integrity": "sha512-IaOQ9puYtjrkq7Y0Ygl9KDZnrf/aiUJYUpVf89y8kyaxbRG7Y1SrX/jaumrv81vc61+kiMempujsM3Yw7w5qcw=="
},
"node_modules/glob": {
"version": "10.3.6",
"resolved": "https://registry.npmjs.org/glob/-/glob-10.3.6.tgz",
"integrity": "sha512-mEfImdc/fiYHEcF6pHFfD2b/KrdFB1qH9mRe5vI5HROF8G51SWxQJ2V56Ezl6ZL9y86gsxQ1Lgo2S746KGUPSQ==",
"version": "10.3.10",
"resolved": "https://registry.npmjs.org/glob/-/glob-10.3.10.tgz",
"integrity": "sha512-fa46+tv1Ak0UPK1TOy/pZrIybNNt4HCv7SDzwyfiOZkvZLEbjsZkJBPtDHVshZjbecAoAGSC20MjLDG/qr679g==",
"dev": true,
"dependencies": {
"foreground-child": "^3.1.0",
"jackspeak": "^2.0.3",
"jackspeak": "^2.3.5",
"minimatch": "^9.0.1",
"minipass": "^5.0.0 || ^6.0.2 || ^7.0.0",
"path-scurry": "^1.10.1"
},
"bin": {
"glob": "dist/cjs/src/bin.js"
"glob": "dist/esm/bin.mjs"
},
"engines": {
"node": ">=16 || 14 >=14.17"
@@ -9407,9 +9407,9 @@
}
},
"node_modules/jackspeak": {
"version": "2.3.3",
"resolved": "https://registry.npmjs.org/jackspeak/-/jackspeak-2.3.3.tgz",
"integrity": "sha512-R2bUw+kVZFS/h1AZqBKrSgDmdmjApzgY0AlCPumopFiAlbUxE2gf+SCuBzQ0cP5hHmUmFYF5yw55T97Th5Kstg==",
"version": "2.3.6",
"resolved": "https://registry.npmjs.org/jackspeak/-/jackspeak-2.3.6.tgz",
"integrity": "sha512-N3yCS/NegsOBokc8GAdM8UcmfsKiSS8cipheD/nivzr700H+nsMOxJjQnvwOcRYVuFkdH0wGUvW2WbXGmrZGbQ==",
"dev": true,
"dependencies": {
"@isaacs/cliui": "^8.0.2"
@@ -12523,9 +12523,9 @@
}
},
"node_modules/minipass": {
"version": "7.0.3",
"resolved": "https://registry.npmjs.org/minipass/-/minipass-7.0.3.tgz",
"integrity": "sha512-LhbbwCfz3vsb12j/WkWQPZfKTsgqIe1Nf/ti1pKjYESGLHIVjWU96G9/ljLH4F9mWNVhlQOm0VySdAWzf05dpg==",
"version": "7.0.4",
"resolved": "https://registry.npmjs.org/minipass/-/minipass-7.0.4.tgz",
"integrity": "sha512-jYofLM5Dam9279rdkWzqHozUo4ybjdZmCsDHePy5V/PbBcVMiSZR97gmAy45aqi8CK1lG2ECd356FU86avfwUQ==",
"dev": true,
"engines": {
"node": ">=16 || 14 >=14.17"

View File

@@ -11,15 +11,15 @@
"build": "astro build --config '.config/astro.config.ts'",
"preview": "astro preview",
"typecheck:astro": "astro check",
"typecheck:tsc": "tsc --noEmit",
"typecheck": "npm run typecheck:astro && npm run typecheck:tsc",
"typecheck:tsc": "tsc --noEmit --pretty",
"typecheck": "run-p typecheck:astro typecheck:tsc",
"prebuild": "run-p --silent --continue-on-error create:symlinks create:icons move:downloads",
"test:unit": "vitest run --config './test/vitest.config.ts' --coverage",
"test:e2e": "playwright test --config './test/playwright.config.ts'",
"lint": "run-p --silent lint:js lint:css lint:md",
"lint:js": "eslint --ignore-path .gitignore --ext .ts,.tsx,.astro,.mjs,.js,.cjs .",
"lint:js": "eslint --ignore-path .gitignore --ext .ts,.tsx,.astro,.mjs,.js,.cjs ./{src,test,scripts}/",
"lint:css": "stylelint --config '.config/.stylelintrc.json' 'src/**/*.css'",
"lint:md": "markdownlint --config '.config/.markdownlint.json' './**/*.{md,markdown}' --ignore './{node_modules,public,.cache,dist,.git,coverage}/**/*'",
"lint:md": "markdownlint --config '.config/.markdownlint.json' --ignore-path .gitignore --dot './**/*.{md,markdown}'",
"format": "prettier --ignore-path .gitignore --write '**/*.{js,jsx,ts,tsx,md,json,css,astro,yml}'",
"deploy:s3": "./scripts/deploy-s3.sh",
"new": "ts-node --esm scripts/new/index.ts",
@@ -107,10 +107,15 @@
"type": "git",
"url": "https://github.com/kremalicious/blog.git"
},
"browserslist": [
">0.2%",
"not dead",
"not ie <= 11",
"not op_mini all"
]
"browserslist": {
"production": [
"defaults",
">0.2%"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
}
}
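The new browserslist config resolves to different browser sets for production and development. To inspect what each environment actually matches, something like this should work, assuming the browserslist CLI is reachable through npx in this project:
```bash
# print the browsers matched by each environment
npx browserslist --env=production
npx browserslist --env=development
```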

View File

@@ -3,8 +3,9 @@ import config from '@config/blog.config'
import { RainbowKitProvider } from '@rainbow-me/rainbowkit'
import { WagmiConfig } from 'wagmi'
import { wagmiConfig, chains, theme } from '@lib/rainbowkit'
import type { ReactElement } from 'react'
export default function Web3() {
export default function Web3(): ReactElement {
return (
<WagmiConfig config={wagmiConfig}>
<RainbowKitProvider chains={chains} theme={theme}>

View File

@@ -12,15 +12,16 @@ export function getTransactionMessage(transactionHash?: string): {
}
}
const constructMessage = (
function constructMessage(
transactionHash: string,
message?: { text?: string }
) =>
transactionHash
): string | undefined {
return transactionHash
? message?.text +
'<br /><br />' +
getTransactionMessage(transactionHash).transaction
'<br /><br />' +
getTransactionMessage(transactionHash).transaction
: message && message.text
}
const classes = (status: string) =>
status === 'success'

View File

@@ -1,10 +1,13 @@
import { type ReactElement, useEffect, useState } from 'react'
import styles from './Conversion.module.css'
export async function getFiat(
amount: number,
export async function getFiat({
amount,
tokenId = 'ethereum'
): Promise<{ [key: string]: string }> {
}: {
amount: number
tokenId?: string
}): Promise<{ [key: string]: string }> {
const url = `https://api.coingecko.com/api/v3/simple/price?ids=${tokenId}&vs_currencies=eur%2Cusd`
const response = await fetch(url)
const json = await response.json()
@@ -34,7 +37,10 @@ export default function Conversion({
async function getFiatResponse() {
try {
const tokenId = symbol === 'MATIC' ? 'matic-network' : 'ethereum'
const { dollar, euro } = await getFiat(Number(amount), tokenId)
const { dollar, euro } = await getFiat({
amount: Number(amount),
tokenId
})
setConversion({ euro, dollar })
} catch (error) {
console.error((error as Error).message)

View File

@@ -1,7 +1,10 @@
import { type CollectionEntry } from 'astro:content'
import { getAllPosts } from './index'
import { slugifyAll } from '../slugify'
export async function getPostsByTag(tag: string) {
export async function getPostsByTag(
tag: string
): Promise<CollectionEntry<'articles' | 'links' | 'photos'>[]> {
const allPosts = await getAllPosts()
return allPosts.filter((post) =>
slugifyAll(post.data.tags || []).includes(tag)

View File

@@ -1,7 +1,6 @@
import type { CollectionEntry } from 'astro:content'
import path from 'node:path'
export function getSlug(filePath: string) {
export function getSlug(filePath: string): string {
const parsedPath = path.parse(filePath)
let slug
@@ -18,5 +17,5 @@ export function getSlug(filePath: string) {
// remove the date prefix
slug = slug.substring(11)
return slug as CollectionEntry<'articles' | 'photos' | 'links'>['slug']
return slug
}

View File

@@ -1,4 +1,4 @@
import { getCollection } from 'astro:content'
import { getCollection, type CollectionEntry } from 'astro:content'
import { readOutExif } from '@lib/exif'
import path from 'path'
import config from '@config/blog.config'
@@ -13,7 +13,7 @@ import { getSlug } from './getSlug'
//
export async function loadAndFormatCollection(
name: 'articles' | 'links' | 'photos'
) {
): Promise<CollectionEntry<'articles' | 'links' | 'photos'>[]> {
let postsCollection = await getCollection(name)
// filter out drafts, but only in production
@@ -36,7 +36,7 @@ export async function loadAndFormatCollection(
const githubLink = `${config.repoContentPath}/${post.collection}/${post.id}`
post.slug = slug
post.slug = slug as CollectionEntry<'articles' | 'links' | 'photos'>['slug']
post.data.date = date
post.data.githubLink = githubLink

View File

@@ -24,7 +24,7 @@ export function formatGps(gpsData: FastExif['gps']): {
return { latitude, longitude }
}
export function formatExposure(exposureMode: number) {
export function formatExposure(exposureMode: number): string {
if (!exposureMode || exposureMode === 0) return `+/- 0 ev`
const exposureShortened = parseFloat(exposureMode.toFixed(2))

View File

@@ -3,7 +3,7 @@ import { markdownToHtml } from '../markdown'
export async function getFeedContent(
post: CollectionEntry<'articles' | 'photos' | 'links'>
) {
): Promise<string> {
const footer =
'<hr />This post was published on <a href="https://kremalicious.com">kremalicious.com</a>'
const content = await markdownToHtml(post.body)

View File

@@ -1,4 +1,4 @@
export async function getRepo(name: string) {
export async function getRepo(name: string): Promise<any> {
// name comes in as user/repo
const user = name.split('/')[0]
const repo = name.split('/')[1]

View File

@@ -1,6 +1,6 @@
import { createMarkdownProcessor } from '@astrojs/markdown-remark'
export async function markdownToHtml(markdown: string) {
export async function markdownToHtml(markdown: string): Promise<string> {
const processor = await createMarkdownProcessor()
const { code } = await processor.render(markdown)
return code

View File

@@ -1,6 +1,9 @@
import slugifyLib from 'slugify'
export const slugify = (text: string) =>
slugifyLib(text, { lower: true, remove: /[*+~.()'"!:@]/g })
export function slugify(text: string): string {
return slugifyLib(text, { lower: true, remove: /[*+~.()'"!:@]/g })
}
export const slugifyAll = (arr: string[]) => arr.map((str) => slugify(str))
export function slugifyAll(arr: string[]): string[] {
return arr.map((str) => slugify(str))
}

View File

@@ -1,4 +1,7 @@
export function getUmamiConfig(env = import.meta.env) {
export function getUmamiConfig(env = import.meta.env): {
UMAMI_SCRIPT_URL: string
UMAMI_WEBSITE_ID: string
} {
const UMAMI_SCRIPT_URL = env.PUBLIC_UMAMI_SCRIPT_URL
const UMAMI_WEBSITE_ID = env.PUBLIC_UMAMI_WEBSITE_ID
const isProduction = env.PROD