async/await

using promises:

something()
  .then(response => {
    return another(response.id);
  })
  .then(response => {
    console.log("another promise");
  })
  .catch(error => console.error(error));

async/await version:

async function blah() {
  let response1 = await something();
  let response2 = await another(response1.id);
}
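
The promise chain handles errors with .catch(); the async/await equivalent is a plain try/catch around the awaits. A minimal sketch:

async function blah() {
  try {
    let response1 = await something();
    let response2 = await another(response1.id);
  } catch (error) {
    console.error(error);
  }
}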

A2Z W3 Class Notes

Some JavaScript functions take regexes as arguments:

paragraph.match(/quick/g);

replace() + regex + callback: https://github.com/shiffman/A2Z-F18/blob/master/week2-regex/08_replace_with_callback/sketch.js
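
A quick sketch of that pattern (with a made-up string, not the linked example): the callback receives each match and returns its replacement.

let txt = "the quick brown fox";
let shouted = txt.replace(/\w+/g, word => word.toUpperCase());
console.log(shouted); // "THE QUICK BROWN FOX"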


fetch(url)
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error(error));


CORS workaround: cors-anywhere https://github.com/Rob--W/cors-anywhere
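
cors-anywhere works as a proxy you prefix onto the request URL; a minimal sketch, assuming the public demo instance is still available:

fetch('https://cors-anywhere.herokuapp.com/' + url)
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error(error));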

Hello, Computer Week 1 / A2Z Week 2 Homework

https://xujenna.github.io/a2z/wk2/index.html

For this week’s homework, I decided to rebuild a Markov model with RiTa.js that I had previously created in Python with markovify and NLTK. This time, it would respond (loosely) to a user’s input and speak aloud via the Web Speech API.

I had initially experimented with markov models in python because I had the idea to create a sort of self-care assistant as the final phase of my mood prediction project, and had dreams of it being this omnipotent and omnipresent keeper. While I have yet to figure out how to implement such a presence, I did have an idea of what I wanted it to sound like: a mixture of the exercises in Berkeley’s Greater Good in Action, NY Mag’s Madame Clairevoyant, and Oprah. I had assembled corpuses for each of these personalities manually.

It was incredibly easy to build this Markov model with RiTa, and the results were surprisingly coherent; with markovify, it was necessary to POS-ify the text with NLTK in order to force some semblance of grammar into the model. However, RiTa didn’t seem to have a native option for seeding the output with the user’s input, so to make the model responsive I used RiTa’s KWIC feature to gather all of the sentences from the source text that contained each stemmed word of the input, then loaded what the KWIC returned back into the Markov model as an additional, heavily weighted source. The resulting text was consistently able to make subtle references to the user’s input.
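
A rough sketch of that flow, using current RiTa (v2) names (RiTa.markov(), RiTa.sentences(), RiTa.stem(), addText(), generate()), which may differ from the version I used in 2018, and filtering sentences directly rather than going through the KWIC helper:

// corpusText holds the combined source texts
const markov = RiTa.markov(3);
markov.addText(corpusText);

function respondTo(input) {
  // gather every corpus sentence containing a stemmed word from the input
  let stems = RiTa.tokenize(input).map(w => RiTa.stem(w));
  let matches = RiTa.sentences(corpusText)
    .filter(s => stems.some(stem => s.toLowerCase().includes(stem)));
  if (matches.length) {
    // reload the matches as an extra source with a high weight
    // (assumes addText takes an optional weight multiplier)
    markov.addText(matches.join(' '), 5);
  }
  return markov.generate(2).join(' ');
}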

The last step was to feed the Markov model’s response into the speech synthesizer, which was pretty straightforward, but the creepy, male, pixelated voice gives this experience the uncanny feeling that every divine being deserves.
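
Speaking the response takes just a couple of lines with the Web Speech API (responseText here stands in for whatever the model returns):

let utterance = new SpeechSynthesisUtterance(responseText);
window.speechSynthesis.speak(utterance);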

A2Z Wk2 Class Notes

Newer way of loading data (JSON) that avoids callback hell:

fetch(url).then(gotData).catch(gotError);
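
With fetch, the first callback receives a Response object, so the data still needs a .json() step; a minimal sketch with the named callbacks written out:

fetch(url)
  .then(response => response.json())
  .then(gotData)
  .catch(gotError);

function gotData(data) {
  console.log(data);
}

function gotError(error) {
  console.error(error);
}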

async await for sequential execution, avoids promise hell

() => {} is a replacement for an anonymous function

=> with no braces works for a single expression (implicit return):

button.mousePressed(() => background(255,0,0));

loadJSON('data.json', data => console.log(data));

for…of loop:

for (let word of words) {
  let span = createSpan(word);
  span.mouseOver(() => span.style("background-color", "red"));
}


REGEX

  • Match a single word character: \w (a whole word: \w+)
  • Match the beginning of a line: ^
  • Match the first word of a line: ^\w+
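
A quick sketch of those patterns in JavaScript (the sample string is just for illustration):

let line = "the quick brown fox";
console.log(line.match(/\w+/g));    // ["the", "quick", "brown", "fox"]
console.log(line.match(/^\w+/)[0]); // "the" (first word of the line)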