| Commit message | Author | Age | Files | Lines |
|
|
|
| |
Useful for filtering pages through commands like rdrview.
|
| |
|
| |
|
| |
|
|
|
|
|
| |
* pass 0 so that e.g. git does not hang
* use SIGTSTP so that e.g. CGI scripts can clean up if needed
|
|
|
|
|
|
|
|
|
|
| |
Derived from w3mman2html.cgi; there are only a few minor differences:
* different man page opener command
* use the man:, man-k:, man-l: schemes instead of a query string to
specify the action (see the sketch below)
* no form input (C-lC-uman:pageC-m is faster anyway)
TODO: rewrite in Nim so we don't have to depend on Perl...
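For context, a minimal sketch of how such schemes could be routed to the
script through urimethodmap; the entries, the cgi-bin: handler form and
the script path are all hypothetical here, not the repository's actual
configuration:

    # hypothetical urimethodmap entries mapping each scheme to one script
    man:    cgi-bin:man.cgi
    man-k:  cgi-bin:man.cgi
    man-l:  cgi-bin:man.cgi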
|
|
|
|
| |
This configuration scheme really is a nightmare to use :(
|
|
|
|
|
|
|
|
|
|
|
|
| |
* Add functions for moving to the beginning/end of words (vi `b', `e');
see the binding sketch below.
* As it turns out, there are many possible interpretations of what a
word is. Now we have a function for each reasonable interpretation,
and the default settings match those of vi (and w3m in w3m.toml).
(Exception: it's still broken on line boundaries... TODO)
* Remove `bounds` from lineedit; it was horrible API design and mostly
useless. In the future, an API similar to what the pager now has could
be added.
* Update docs, and fix some spacing issues with symbols in the tables.
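A hypothetical config.toml fragment showing how such movements might be
bound; the command names below are made up for illustration, the real
ones are listed in the docs and in bonus/w3m.toml:

    # hypothetical command names; check the documentation for the real ones
    [page]
    'b' = 'pager.cursorWordBegin()'
    'e' = 'pager.cursorWordEnd()'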
|
|
|
|
|
|
|
|
|
|
| |
* Fix the incorrect internal definition of the fragment percent-encode set
* urlenc, urldec: simple utility programs mainly for use with local CGI
shell scripts; see the sketch below. (Sadly, the printf + xargs
solution is not portable.)
* Pass the libexec directory to local CGI scripts as an environment variable
* Update trans.cgi to use urldec, and add an example of combining
it with selections
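A minimal local CGI sketch using urldec, assuming the libexec directory
is exposed under an environment variable named CHA_LIBEXEC_DIR (the
variable name is an assumption, not confirmed by this log):

    #!/bin/sh
    # decode the percent-encoded query string and echo it back as plain text
    # (CHA_LIBEXEC_DIR is an assumed name for the libexec directory variable)
    decoded=$(printf '%s' "$QUERY_STRING" | "$CHA_LIBEXEC_DIR"/urldec)
    printf 'Content-Type: text/plain\n\n'
    printf '%s\n' "$decoded"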
|
|
|
|
| |
why not
|
| |
|
| |
|
| |
|
|
|
|
|
|
| |
Now it actually does what it was supposed to do.
Also, clarify what it does in config.md
|
|
|
|
| |
hopefully this works
|
| |
|
| |
|
| |
|
|
|
|
|
| |
Multipart through local CGI is now supported as well.
(Also, fix an inaccuracy in the Cha-Control description.)
|
|
|
|
|
|
|
|
|
|
|
|
| |
Now it is (technically) no longer mandatory to link to libcurl.
Also, Chawan is at last completely protocol and network backend
agnostic :)
* Implement multipart requests in local CGI
* Implement simultaneous download of CGI data
* Add a REQUEST_HEADERS env var containing all request headers (see the sketch below)
* cssparser: add a missing check in consumeEscape
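As a rough illustration, a local CGI script could dump the new
REQUEST_HEADERS variable like this (a minimal sketch, not a file from
the repository):

    #!/bin/sh
    # echo the full request header block back to the client as plain text
    printf 'Content-Type: text/plain\n\n'
    printf '%s\n' "$REQUEST_HEADERS"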
|
|
|
|
| |
error codes are WIP, not final yet...
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
* Add MAPPED_URI_* environment variables when a request comes
from urimethodmap (see the sketch below).
It costs us compatibility with w3m, but it seems to be a massive
improvement over smuggling in the URL as a query string and then
writing an ad-hoc parser for every single urimethodmap script.
The variables are set for every urimethodmap request, to avoid
accidental leaking of global environment variables.
* Move about: to adapters (an obvious improvement over the previous
solution)
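A hypothetical urimethodmap CGI sketch using the new variables; the
exact set of MAPPED_URI_* component names is an assumption here:

    #!/bin/sh
    # read the pre-parsed URI components instead of parsing a query string by hand
    # (component names such as MAPPED_URI_SCHEME/PATH/QUERY are assumed)
    printf 'Content-Type: text/plain\n\n'
    printf 'scheme: %s\n' "$MAPPED_URI_SCHEME"
    printf 'path:   %s\n' "$MAPPED_URI_PATH"
    printf 'query:  %s\n' "$MAPPED_URI_QUERY"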
|
| |
|
|
|
|
|
|
| |
This is better than %u, as it is backwards compatible (i.e. it does not
rely on other user agents doing whatever upon encountering an unknown
substitution template).
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Now we use a (much simplified) gopher2html binary in libexec,
instead of converting gopher directories to HTML in loader/gopher.
This has two advantages:
* Less ugly conversion logic in the loader module; we can just
convert the file line by line. (The previous converter also had
some correctness issues; those are fixed now as well.)
* If the user desires, they can replace the gopher converter with
another binary using the mailcap mechanism (see the sketch below).
The disadvantages are:
* For now, source display is broken. This is a problem with all
mailcap filters in general, and should be fixed in the future. (That
said, the previous version also only displayed the converted HTML
source, which was not really useful anyway.)
* The proper directory structure is required for this to work;
OTOH plenty of work has been done so that this is as frictionless as
possible, so it should not really be a problem.
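For illustration, replacing the converter might look something like the
following mailcap entry; the content type assigned to gopher directories
and the x-htmloutput flag are assumptions here, so check the mailcap
documentation for the real values:

    # hypothetical entry: hand gopher directories to a custom HTML converter
    text/gopher; /usr/local/bin/my-gopher2html; x-htmloutput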
|
|
|
|
|
|
|
|
|
| |
* Paths are now parsed through a unified code path, with some useful
additions like environment variable substitution (see the sketch below).
* Fix a bug in parseConfigValue where strings would be appended to
existing arrays (and not override them).
* Fix beforeLast calling afterLast for some reason.
* Add a default CGI directory.
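A hypothetical config snippet illustrating the idea; the section and
option names are assumptions, the point is only that environment
variables are now expanded inside path values:

    # hypothetical option name; $HOME is expanded when the path is parsed
    [external]
    cgi-dir = "$HOME/cgi-bin"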
|
|
|
|
|
| |
The default is vi-style, but w3m-style marks work as well; see
bonus/w3m.toml.
|
|
|
|
|
|
|
|
| |
{ and } act as in vi (except the cursor is not moved to the
beginning of the line).
There is no reason to leave externInto undocumented, as it is even
used in the default config.
|
|
|
|
|
|
|
| |
* Get rid of useless targets
* Use real recipes instead of command runner targets
* Respect environment variables when they are given (see the sketch below)
* Document the Makefile in doc/build.md
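A rough usage sketch under these assumptions; PREFIX is an assumed
variable name, and doc/build.md has the authoritative list:

    # override build variables from the environment or the command line, then install
    make PREFIX=$HOME/.local
    make PREFIX=$HOME/.local install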
|
| |
|
|
|
|
|
|
|
|
|
|
|
| |
Without this, setting the color mode using -o required quoting
the value, and then shell-quoting the quotes themselves
(cha -o 'display.color-mode="24bit"').
Instead of adding more special cases to the TOML parser, we just add
aliases for these enum values that can be parsed using TOML bare string
rules. So now this works:
cha -o display.color-mode=true-color
|
|
|
|
|
|
| |
* add showcase picture
* add link to sourcehut project page (until sourcehut adds one)
* add example usage
|
|
|
|
| |
This is consistent with what w3m does and is way more convenient.
|
| |
|
| |
|
| |
|
| |
|
| |
|
| |
|
| |
|
| |
|
|
|
|
| |
yay
|
|
|
|
|
|
|
|
|
|
|
| |
Add w3m-style local CGI support.
It is not quite as powerful as w3m's local CGI, because it lacks an
equivalent to W3m-control. Not sure if it's worth adding; we certainly
shouldn't allow passing JS in headers, but a custom language for
headers does not sound like a great idea either...
eh, idk. Also, TODO: add multipart.
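For reference, a minimal local CGI script in the w3m style looks roughly
like this; a sketch based on the usual CGI conventions, not a file from
the repository:

    #!/bin/sh
    # print a header block, an empty line, then the document body
    printf 'Content-Type: text/html\n'
    printf '\n'
    printf '<h1>Hello from local CGI</h1>\n'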
|
| |
|
| |
|
| |
|
|
|
|
| |
8-bit colors are now supported
|
| |
|
|
|
|
|
|
| |
pandoc can only generate man page tables from markdown tables, but the
markdown pipe table syntax is horrible. So instead of converting our
markdown documentation to that syntax by hand, we just rewrite it
programmatically when generating the man pages.
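A hypothetical sketch of that pipeline; the preprocessor name and the
output file are assumptions, not the project's actual build rule:

    # rewrite our table markup into pipe tables, then let pandoc emit man output
    ./md2manpreproc doc/config.md | pandoc --standalone --from markdown --to man -o cha-config.5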
|
|
|
|
| |
still needs some work
|