Hi Racketeers! I’m having a problem with building an xexpr.
First, I use the markdown library to generate a list of lists of xexprs from markdown files.
(define (article-paths->xexprs article-paths)
  (map parse-markdown article-paths))
Then, I want to serve the result as html.
(define (start request)
  (response/xexpr
   `(html (head (title ,(path->string (first fnames))))
          (body ,(first articles)))))
I get this error message: response/xexpr: contract violation;
Not an Xexpr. Expected an attribute value string, given '((id "test-1"))
And this is the xexpr that is built: '(html
(head (title "test1.md"))
(body
((h1 ((id "test-1")) "Test 1")
(p () "Test.")
(h2 ((id "subtest")) "Subtest")
(h1 ((id "testovich")) "Testovich")
(p () "Another one tests the dust."))))
Whole program: https://gist.github.com/turbinenreiter/309fa8750489c0fe3205d16e0f1a3f5e
I’m not sure what to make of the error message. What I’m passing in is an xexpr, at least I think so :thinking_face:
@oeplse try removing the first open paren from ((h1 ((id "test-1"))...
and its closing paren at the end.
As it is, it treats h1 as an attribute (and not an element) and expects its value, hence waiting for a string.
That makes sense. How do I do that? This part of the expression is generated by the markdown parser. I tried (first (first articles)), but that gave me only the (h1 ...) back.
@oeplse try replacing (body ,(first articles))
with ,(foldl cons (first articles) '(body))
@oeplse parse-markdown returns a list of xexprs (not just one). So (first articles) is a list of x-expressions. So I think you want to use unquote-splicing (a.k.a. ,@) instead of unquote (a.k.a. ,).
That is:
(define (start request)
  (response/xexpr
   `(html
     (head (title ,(path->string (first fnames))))
     (body ,@(first articles)))))
@greg that’s a much nicer solution :slightly_smiling_face:
@githree Thank you. But. This was clear to me only because I’ve done it so many times. (Having written the markdown pkg and used it in various things including Frog.) ¯\_(ツ)_/¯
:+1: That works!
So, unquote-splicing is like unquote, but with the result not being its own list but spliced into the surrounding list?
@oeplse Yes.
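A tiny sketch of that difference (the list here is made up for illustration):

```racket
#lang racket

;; A made-up stand-in for what parse-markdown returns: a list of xexprs.
(define articles '((h1 ((id "a")) "A") (p () "B")))

;; unquote inserts the whole list as one nested element (not a valid xexpr body):
`(body ,articles)   ; '(body ((h1 ((id "a")) "A") (p () "B")))

;; unquote-splicing splices each element into the surrounding list:
`(body ,@articles)  ; '(body (h1 ((id "a")) "A") (p () "B"))
```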
@ghoetker you can use plutil -p to just print the plist w/o converting the actual file. Capturing that and then parsing the XML in racket should be easy after that.
The man page I have for plutil claims that -p “[prints] the property list in a human-readable fashion. The output format is not stable and not designed for machine parsing. The purpose of this command is to be able to easily read the contents of a plist file, no matter what format it is in,” which seems not-so-useful if you want to automate things.
oops. yup. I forgot that it uses a quasi-json like output for that and not xml. I might be thinking of a different (older) tool.
plutil -convert xml1 does output XML, which is what @ghoetker was already using. (And -o - outputs to stdout.)
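Putting those two flags together, a sketch (assuming macOS and a file named example.plist):

```shell
# Convert a (possibly binary) plist to XML and print it to stdout,
# ready to be piped into or read by a Racket XML parser.
plutil -convert xml1 -o - example.plist
```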
yup. Found what I was confusing it with. /usr/libexec/PlistBuddy -x -c Print
as far as long-lived processes go… you could bypass the costs of subprocesses and use libplist directly… I’ve not done FFI in Racket yet, but I imagine the effort might be worth it if you’re going to churn through tens of thousands of plists
less than that and I probably wouldn’t bother w/o lots of measurement
I’m converting something I’ve previously written in AppleScript to Racket. I’m struggling with parsing the JSON. Having retrieved some JSON (cleverly called theJSON) from crossref.org, the key part of my AppleScript is this, which I think is self-explanatory.
repeat with i from 1 to 3
    set theCitation to (fullCitation of item i of theJSON)
    set theDOI to (doi of item i of theJSON)
    ...Do something with theCitation and theDOI...
end repeat
I am stumped at replicating this in Racket. In particular, I’m not clear (a) how to get at item 1, 2 or 3 within the JSON and (b) having done so, how to get at their values for “doi” and “fullCitation”. Much Googling and reading the docs has left me still confused.
Here is the raw JSON and the corresponding jsexpr (just 2 items to save space).
"[\n {\n \"doi\": \"http://dx.doi.org/10.1007/978-3-540-85582-8_14\",\n \"score\": 18.289448,\n \"normalizedScore\": 100,\n \"title\": \"Smart Business Network in Non-Modular Industries\",\n \"fullCitation\": \"Johannes Meuer, 2009, 'Smart Business Network in Non-Modular Industries', <i>The Network Experience</i>, pp. 211-228\",\n \"coins\": \"ctx_ver=Z39.88-2004&amp;rft_id=info%3Adoi%2Fhttp%3A%2F%2Fdx.doi.org%2F10.1007%2F978-3-540-85582-8_14&amp;rfr_id=info%3Asid%2Fcrossref.org%3Asearch&amp;rft.atitle=Smart+Business+Network+in+Non-Modular+Industries&amp;rft.jtitle=The+Network+Experience&amp;rft.date=2009&amp;rft.spage=211&amp;rft.epage=228&amp;rft.aufirst=Johannes&amp;rft.aulast=Meuer&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&amp;rft.genre=bookitem&amp;rft.au=Johannes+Meuer\",\n \"year\": \"2009\"\n },\n {\n \"doi\": \"http://dx.doi.org/10.3403/30179561u\",\n \"score\": 18.124527,\n \"normalizedScore\": 99,\n \"title\": \"Aerospace series. Modular and open avionics architectures\",\n \"fullCitation\": \"'Aerospace series. Modular and open avionics architectures'\",\n \"coins\": \"ctx_ver=Z39.88-2004&amp;rft_id=info%3Adoi%2Fhttp%3A%2F%2Fdx.doi.org%2F10.3403%2F30179561u&amp;rfr_id=info%3Asid%2Fcrossref.org%3Asearch&amp;rft.atitle=Aerospace+series.+Modular+and+open+avionics+architectures&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Aunknown&amp;rft.genre=unknown\",\n \"year\": null\n }\n]"
'(#hasheq((fullCitation . "Johannes Meuer, 2009, 'Smart Business Network in Non-Modular Industries', <i>The Network Experience</i>, pp. 211-228")
(doi . "http://dx.doi.org/10.1007/978-3-540-85582-8_14")
(year . "2009")
(normalizedScore . 100)
(score . 18.289448)
(title . "Smart Business Network in Non-Modular Industries")
(coins
.
"ctx_ver=Z39.88-2004&amp;rft_id=info%3Adoi%2Fhttp%3A%2F%2Fdx.doi.org%2F10.1007%2F978-3-540-85582-8_14&amp;rfr_id=info%3Asid%2Fcrossref.org%3Asearch&amp;rft.atitle=Smart+Business+Network+in+Non-Modular+Industries&amp;rft.jtitle=The+Network+Experience&amp;rft.date=2009&amp;rft.spage=211&amp;rft.epage=228&amp;rft.aufirst=Johannes&amp;rft.aulast=Meuer&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&amp;rft.genre=bookitem&amp;rft.au=Johannes+Meuer"))
#hasheq((fullCitation . "'Aerospace series. Modular and open avionics architectures'")
(doi . "http://dx.doi.org/10.3403/30179561u")
(year . null)
(normalizedScore . 99)
(score . 18.124527)
(title . "Aerospace series. Modular and open avionics architectures")
(coins
.
"ctx_ver=Z39.88-2004&amp;rft_id=info%3Adoi%2Fhttp%3A%2F%2Fdx.doi.org%2F10.3403%2F30179561u&amp;rfr_id=info%3Asid%2Fcrossref.org%3Asearch&amp;rft.atitle=Aerospace+series.+Modular+and+open+avionics+architectures&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Aunknown&amp;rft.genre=unknown")))
Thank you very much for any help!
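A minimal sketch of the literal translation (all names here are illustrative, and the JSON is abridged; string->jsexpr comes from Racket’s json library, which parses a JSON array into a list of hash tables with symbol keys):

```racket
#lang racket
(require json)

;; Hypothetical: raw-json stands in for the string fetched from crossref.org.
(define raw-json
  "[{\"doi\": \"http://dx.doi.org/10.1007/978-3-540-85582-8_14\", \"fullCitation\": \"Johannes Meuer, 2009, ...\"}]")

;; A JSON array becomes a Racket list; each JSON object becomes a hash table.
(define the-json (string->jsexpr raw-json))

;; AppleScript's `item 1` is 1-based; list-ref is 0-based.
(define item1 (list-ref the-json 0))
(hash-ref item1 'doi)           ; "http://dx.doi.org/10.1007/978-3-540-85582-8_14"
(hash-ref item1 'fullCitation)  ; "Johannes Meuer, 2009, ..."
```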
@ghoetker shouldn’t you have a list of hashes for your result? You CAN enumerate over 0, 1, and 2 and grab the nth items from the collection, but that’s more of a literal translation of how you’d do it in AppleScript.
Another approach would be something like take combined with for. Whipping up something close:
(for ([citation (take 3 citations)])
  (let ([full (hash-ref citation 'fullCitation)]
        [doi (hash-ref citation 'doi)])
    ;; do something with these
    ))
(completely untested)
@zenspider Thank you very much. Your solution worked with one quick fix (the ordering of take is actually the list first, then how many to take). The code I ended up writing was this, for anyone’s future reference:
(define (process-search a-jsexpr)
  (for ([citation (take a-jsexpr 2)])
    (let ([full (hash-ref citation 'fullCitation)]
          [doi (hash-ref citation 'doi)])
      (fprintf (current-output-port) "Full citation: ~a, DOI: ~a" full doi))))
The fprintf is just to confirm it worked. I actually do more interesting things with it.
Also, your question “Shouldn’t you have a list of hashes for your result?” intrigued me because I’m not quite sure what you meant. If you have time to expand on it, I’d love to learn more. If not, you’ve already solved my issues and taught me something cool. Thanks!
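The point about a list of hashes can be sketched briefly: since the jsexpr is already a list of hash tables, it can be iterated directly rather than indexed by position 0, 1, 2. (The function name process-all and the sample data below are made up for illustration.)

```racket
#lang racket

;; Sample data in the same shape as the jsexpr above (abridged).
(define a-jsexpr
  (list (hasheq 'fullCitation "Johannes Meuer, 2009, ..."
                'doi "http://dx.doi.org/10.1007/978-3-540-85582-8_14")))

;; Walk the whole list of hashes directly; no indices (and no take) needed.
(define (process-all a-jsexpr)
  (for ([citation (in-list a-jsexpr)])
    (printf "Full citation: ~a, DOI: ~a\n"
            (hash-ref citation 'fullCitation)
            (hash-ref citation 'doi))))
```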