Page 3 of 4
Re: Predictive text toy
Posted: Fri Nov 01, 2019 4:32 am
by alice
I tried it with the opening line of Pride and Prejudice and was presented with a captcha. Make of that what you will.
Meanwhile, let's see if this thing can help me win the pools:
Wednesday 30 October
Scottish Premiership
Celtic 2 St Mirren 0
Hamilton Academical 0 Aberdeen 1
Hibernian 2 Livingston 2
Motherwell 2 Kilmarnock 1
Ross County 0 Rangers 4
St Johnstone 1 Heart of Midlothian 0
... nothing. Ah well, I suppose I'll have to work for another twenty years after all then.
Re: Predictive text toy
Posted: Fri Nov 01, 2019 9:47 am
by chris_notts
This strikes me as pretty creepy somehow...
I love ice cream. I also love to play with baby animals with them. So when I got a message from a guy who said he'd have a birthday party on a Saturday at his house, I went.
I arrived at the man's house at 4 AM and he showed me into his living room. I was just sitting on a couch, holding an ice-cream cooler with little puppy toys in it, when an old man with the face of Bill Murray turned up in a purple suit and hat. He smiled a big grin at me and we talked for a while. He showed me the place, which had been a church and had been turned into a house.
I showed him all the rooms in the house – the bedrooms, the bathroom and the dining room. I sat down, looked him square in the eyes and told him how much I loved it. It seemed a little obvious but it still made me feel happy, so he kept on smiling. When he got done talking, I grabbed my ice cream and went home.
I love him.
Re: Predictive text toy
Posted: Sun Nov 03, 2019 6:02 am
by Qwynegold
It's trying to make up Finnish, but it hasn't understood vowel harmony.
Kun tulva tuli, se tuhosi Suide-puun ja haltiat joutuivat pakenemaan. He päätyivät kolmeen eri paikkaan. Tuolut lään, kaikkaa nautiloid. Suivat tulva vihit, se kyvär pai, lään palkkampe pääjä sotelää ileen. He tulva jusnivat vihit lään paku ei, väyttu lään. Känni vahtäänä lään olläää nou tuleena sissä pakki.
(10) As far as the defendant is concerned, he was not only an individual but was also in the service of the state of Minätajokiö. He was also involved at various times in the affairs of the department of finance and finance officials, also the government of Minätajokiö and on numerous occasions in the military activity
Re: Predictive text toy
Posted: Tue Nov 05, 2019 11:49 am
by Pabappa
The site seems to have its ups and downs, but it's been up every time I've used it since my last post here. Got a new one:
I had to check to see if the Twitter account it made up was real.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 1:35 pm
by Travis B.
Pabappa wrote: ↑Tue Nov 05, 2019 11:49 am
I had to check to see if the Twitter account it made up was real.
I just had to check that too.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 3:31 pm
by Pabappa
Also, it seems they upgraded the algorithm on November 1st from the cut-down form to the full program. That may mean that it's running more slowly now, and may explain why sometimes it goes down.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 3:53 pm
by bradrn
Pabappa wrote: ↑Tue Nov 05, 2019 3:31 pm
Also, it seems they upgraded the algorithm on November 1st from the cut-down form to the full program. That may mean that it's running more slowly now, and may explain why sometimes it goes down.
I thought the full program wasn’t available publicly?
EDIT: Apparently the full-sized program has indeed been released.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 4:49 pm
by Pabappa
bradrn wrote: ↑Tue Nov 05, 2019 3:53 pm
Pabappa wrote: ↑Tue Nov 05, 2019 3:31 pm
Also, it seems they upgraded the algorithm on November 1st from the cut-down form to the full program. That may mean that it's running more slowly now, and may explain why sometimes it goes down.
I thought the full program wasn’t available publicly?
Well, I'm not sure if you need developer access or if literally anyone can download it, but the blurb at https://talktotransformer.com/ confirms that it is now running on the full-sized GPT-2 program.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 5:26 pm
by Travis B.
Also, it seems they upgraded the algorithm on November 1st from the cut-down form to the full program. That may mean that it's running more slowly now, and may explain why sometimes it goes down. It took me a while to figure out that this was the case.
Here's a video of a recent update that was happening in my program.
Here's the video in full.
I can't wait for the next update to come out so I can try to figure out why my program can't keep up with the rest of the field and how to fix it.
If anyone wants to join me in trying to figure out what's up with the cut-down form and how I can fix it, I'm here and on Slack.
Re: Predictive text toy
Posted: Tue Nov 05, 2019 9:18 pm
by bradrn
I just came across another website which implements GPT-2: http://textsynth.org/. It seems to be quite a bit slower than the website we’ve been using, though.
Re: Predictive text toy
Posted: Sat Nov 09, 2019 5:34 am
by Xwtek
It can even make a fictional story:
transformer wrote:
Do you know about the Sakha tribe? I once lived among them. In fact, a significant amount of my teenager life is spent there. When I was a young child, we were on a horse-drawn cart. The Sakha were very welcoming. When we got to our village, they asked if I wanted to stay with them. We did so, and the Sakha tribe welcomed us with a large meal. They showed us the way home.
The day after my visit to the Sakha, I was at a friend's house. I was invited to go with her to the temple of the Sakha. I refused. She tried to persuade me, saying it would be good to go to school in the nearby town of Yakhni.
"I won't go with you, you know," I said. "If you can go to school in Yakhni, then I am willing to go to school
Well, it's not perfect. It's been established that "a significant amount of my teenager life is spent there", but then: "When I was a young child". And I'm still confused by the last sentence.
Re: Predictive text toy
Posted: Mon Nov 18, 2019 11:36 am
by Pabappa
We may be fast approaching the era in which neural networks are used to generate articles that look like news stories but really just serve as a way to get us to look at ads .... or we may already be there. A crop of sites like maternityweek.com, petsdetective.com, atlanticmirror.com, etc. have essentially identical content despite their names, with only a slight focus on the subject matter. E.g. Maternity Week features some stories aimed at mothers, or at least about mothers, but nothing to do with pregnancy, care for newborn babies, etc. .... it looks as if they just took the word "maternity" as a synonym for "mother". Likewise Pets Detective has very few stories about pets, and a lot of stories about large wild animals such as moose and whales. So it's as if they took the word "pet" as a synonym for "animal".
Really sloppy design, but I suspect they don't care at all, because even calling their stories "stories" is far too kind, as they have little meaningful content. If there's any effort at all in putting the stories together, it's to get the user to scroll as far as possible down the page waiting to find the content that actually relates to the headline. In the meantime the user will of course have scrolled past a bunch of advertisements.
These sites are generated by Scribol Publishing, a division of Pub Ocean, an advertising agency that has been around quite a while. So I suspect that they know exactly what they're doing and that the sites are designed this way on purpose, and that they could easily produce a hundredfold more of them if they wanted to, since domain names are cheap and the sites are essentially all the same under the hood.
So, is this the future of neural network programming? I hope not. Like any technology it has good uses and bad uses. I knew someone who was working with NN programming and might have more information .... and may have even worked on GPT himself, but I haven't seen him online in nearly a year and I don't know any way to contact him.
Re: Predictive text toy
Posted: Wed Nov 20, 2019 8:11 pm
by bradrn
I was just reading this article about GPT-2, and tried the website we’ve been using with their ‘Quitterbread bars’ prompt. After a couple of tries I got this:
GPT-2 wrote:
Quitterbread bars.
Ingredients:
1 large egg
2 tbsp water
1/2 tbsp coconut flour
2 tsp baking powder
1/2 tsp baking soda
1/2 tsp salt
1/4 cup almond milk (can sub soy or almond milk)
1/4 cup palm oil
Preheat oven to 350F.
In a medium bowl, whisk together the egg, water, coconut flour, baking powder, baking soda, and salt. Set aside. In a separate bowl, whisk together the almond milk and palm oil. Whisk in the egg mixture. Pour into a 9″ x 13″ pan and bake for 30 minutes.
Cool on a wire rack before serving.
*– For making this recipe, I used my almost fully broken soy milk, which worked much better than the almond milk. It still worked though!
Honestly, I find this really, really impressive. In particular, I’m impressed by the way that every ingredient listed is used, and the recipe even uses them in order. It also remembers it set aside an ‘egg mixture’, which needs to be mixed back in.
(I do wonder how this would taste though…)
Re: Predictive text toy
Posted: Wed Nov 20, 2019 8:42 pm
by Vijay
What the hell kind of awful recipes have you been reading?
Maybe I should try posting my recipe for spiced buttermilk.
Re: Predictive text toy
Posted: Wed Nov 20, 2019 8:57 pm
by bradrn
Vijay wrote: ↑Wed Nov 20, 2019 8:42 pm
What the hell kind of awful recipes have you been reading?
Maybe I should try posting my recipe for spiced buttermilk.
Spiced buttermilk, you say? Here you go:
GPT2 wrote:
Spiced buttermilk.
Ingredients:
1-14.5oz can Dr. Seuss's Cinnamon Pudding Mix
4 Tbs melted butter
1/4 cup granulated sugar
2 cups half-and-half (or milk)
4 medium sized eggs
1/2 tsp salt
1/2 tsp cinnamon
Directions:
Heat oven to 350 degrees Fahrenheit. In a small saucepan over medium-high heat, melt butter. Stir in sugar, cinnamon, and half-and-half. Pour into a 9-inch square baking dish or baking pan. Pour in Dr. Seuss's Cinnamon Pudding Mix. Bake for 15-18 minutes or until set. Cool before eating.
Nutrition Facts:
After many tries, this was the first complete recipe I got, and it looks perfectly awful! I don’t even want to know what’s in that Dr. Seuss's Cinnamon Pudding Mix… Thankfully, a quick Google search shows that it probably doesn’t exist.
Re: Predictive text toy
Posted: Wed Nov 20, 2019 9:36 pm
by Vijay
Oh, lol, I forgot what thread this was!
Re: Predictive text toy
Posted: Thu Nov 21, 2019 8:23 am
by alynnidalar
Yet more evidence of how good GPT-2 is. Upon merely skimming the recipe, it sounds plausible--it's only when I slow down and think about the ingredients that I start noticing that it's off. (and the bit where it suggests subbing out a different type of milk? Obviously nonsensical--ah yes let me replace this almond milk with almond milk--but the fact that it suggests two real types of milk that are, in fact, often used as substitutions, is impressive in and of itself!)
Humorously, I've noticed that when I try to prompt it for a recipe, it often gives me a rambling story tangentially related to the recipe title instead... just like real recipe blogs. The problem with a neural network trained on the internet is that it's really good at mimicking the internet.
EDIT: I take back everything I ever said. It just suggested a recipe that includes "riced" tomatoes. Explain to me how you rice a tomato???
It also produced this list of Thanksgiving essentials. The robots like sage and cranberries, apparently. (lightly formatted for ease of reading):
5 Essentials for Thanksgiving Dinner
1. Peanut Butter Brownies with Maple Bacon
2. Apple Pie with Caramelized Onions and Sage
3. Spiced Apple Crisp
4. Bourbon Nuts
5. Meatloaf
6. Turkey with Sage and Cranberries with Sage-Cranberry Sauce
7. Brussels Sprouts with Sage and Cranberries
8. Black Eyed Peas with Cranberries and Sage & Cranberry Sauce
9. Herb Butter Bean Tofu with Sage, Cranberry, and Red Onion
10. Brussels Sprouts and Brown Rice
11. Christmas Eve with Cranberries and Sage
12. Kale and Cranberry Sauce
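Incidentally, the "predictive text" mechanism underneath all this is just repeated next-word sampling: GPT-2 samples each next token from a learned probability distribution over its vocabulary. The same loop can be sketched in miniature with a word-level bigram model built from raw counts — no neural network, just Python's standard library (the training string here is obviously made up):

```python
import random
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, how often each other word follows it."""
    words = text.split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def generate(follows, start, length=10, seed=0):
    """Repeatedly sample the next word in proportion to its count."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = follows.get(out[-1])
        if not nxt:  # dead end: nothing ever followed this word
            break
        words = list(nxt)
        weights = [nxt[w] for w in words]
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ate the fish"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Swap the bigram counts for a transformer's softmax output over tokens and you have the skeleton of what talktotransformer.com is doing — which is also why it can only remix what it was trained on, right down to the rambling recipe-blog preambles.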
Re: Predictive text toy
Posted: Thu Nov 21, 2019 9:25 am
by Xwtek
It's completely broken, though, if I try to use Indonesian text on it.
GPT-2 wrote:
Halo, nama saya Andreas. Saya adalah murid kelas 5 SD Mahardika Bandung.
Ang nabarap-kapangkat wa aba hereng di nasa kopinda ang dapat tama kaya ang kandidato at penkata. Kinabuha saad walang pekeng gumawa kami. Alam rin nang, sagunang mga aking mahulang kami ng pagtosdaman ang kahit ano kakamali sa buong kalayaan. Kapag naglalahan na siyang bahagi andar lahat ng pinagtugtupay ng aking geoli ng gumagabangga. Just at mang pang alagaat di naman kaming tarek
For some reason, the output looks like Filipino. (Although I'd need native Filipino speakers to confirm this.)
There is also funny output:
GPT-2 wrote:
Hola, mi nombre es Andreas. Soy de Madrid.
Notice how similar these letters are in style, too? That is why I considered both versions together and chose the letter for thematic purposes.
Compare the following examples with the same notes from the German version:
Hola, mi nombre es Andreas. Soy de Madrid.
A look at the letters may be helpful if you want to read about the characters on your own but without the traditional adaptation. If you want to study all the possibilities of playing the music on your own you should play the book!
Contemporary publishing
The third edition was published in the US in 2015. You can buy it here.
I'd love to publish this book in the US and abroad. In fact, I've started an initiative to promote it in the US. See details
"Hola, mi nombre es Andreas. Soy de Madrid" is Spanish, not German.
Re: Predictive text toy
Posted: Thu Nov 21, 2019 9:56 am
by Vijay
I don't know of anyone in a Spanish-speaking country named "Andreas," though.
Re: Predictive text toy
Posted: Thu Nov 21, 2019 4:47 pm
by bradrn
Xwtek wrote: ↑Thu Nov 21, 2019 9:25 am
It's completely broken, though, if I try to use Indonesian text on it.
GPT-2 wrote:
Halo, nama saya Andreas. Saya adalah murid kelas 5 SD Mahardika Bandung.
Ang nabarap-kapangkat wa aba hereng di nasa kopinda ang dapat tama kaya ang kandidato at penkata. Kinabuha saad walang pekeng gumawa kami. Alam rin nang, sagunang mga aking mahulang kami ng pagtosdaman ang kahit ano kakamali sa buong kalayaan. Kapag naglalahan na siyang bahagi andar lahat ng pinagtugtupay ng aking geoli ng gumagabangga. Just at mang pang alagaat di naman kaming tarek
For some reason, the output looks like Filipino. (Although I'd need native Filipino speakers to confirm this.)
I don’t speak Filipino either, but this definitely looks like Filipino to me as well, mainly because of all the ang and ng.