Been playing last weekend with GPT-Neo-2.7B (2.7 billion parameters), a pre-trained model from Eleuther.AI (open source, academic foundation?). If you haven't tried any GPT (Generative Pre-trained Transformer) yet, you may be surprised by what it can do. There are 3 sizes of GPT-Neo; the 2.7B one is 10GB, and all the rest is another 5GB or so of installs.
Installing and running a Hello World of it with Python and Linux was trivial. Didn't try with FreeBSD, don't have 15GB free in FreeBSD, sorry for the sudo apt.
Install
sudo apt install python3 python3-pip
pip3 install torch torchvision torchaudio transformers
Run from Python3
from transformers import pipeline
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B')
text_seed = "A horse, a frog and a giraffe walk into a bar."
ai_answer = generator(text_seed, max_length=128, do_sample=True, temperature=0.7)
print(ai_answer[0]['generated_text'])
On first run it will auto-download the 10GB model (by default into ~/.cache/huggingface/transformers; the smallest pretrained model that can be downloaded is gpt-neo-125M).
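If you just want to check that the setup works before committing to the 10GB download, the same three lines run with the 125M model too. A minimal smoke-test sketch (my assumption: output quality will be much worse, but the download finishes in a minute or two):

```python
from transformers import pipeline

# Smallest GPT-Neo checkpoint (~500MB) -- handy for a quick smoke test
generator = pipeline('text-generation', model='EleutherAI/gpt-neo-125M')

# do_sample=False makes the output deterministic (greedy decoding)
out = generator("A horse walks into a bar.", max_length=40, do_sample=False)
print(out[0]['generated_text'])
```

The text-generation pipeline returns the prompt followed by the continuation, so the printed string starts with the seed text.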
The generated answers can be quite surprising. To me it was scary good, uncanny and unsettling. :oops:
The same model with the 3 lines above was able to:
- generate plain English, so I tried to bamboozle it and put some French in the text_seed=" ... "
- and it spoke back in French! o_O
- tried Romanian, and it answered in plain Romanian, too
- it even speaks "programming": tried Python and it returned Python lines
If you never did this before, you may want to give it a run and see for yourself. It's not the ultimate thing, but this model is free and a couple of years old. It already has much bigger brothers, in both closed and open source implementations.

Some random answers from GPT-Neo-2.7B
text_seed = "A horse, a frog and a giraffe walk into a bar."
A horse, a frog and a giraffe walk into a bar. The frog says, “It’s been a while since I’ve had a drink, but I’ll have one.” The giraffe says, “I’m going to have one, too.” And the bartender says, “How about a drink for the whole group?” The horse says, “Are you going to pay for this?” And the bartender says, “No, it’s on me.”

text_seed = "La mécanisation de l'agriculture a changé la vie de nos villages."
L'économie est devenue stratosphère, la croissance économique était née de la mécanisation de l'agriculture et de l'élevage, les salaires augmentent et les coûts de production sont plus bas.
(English: The economy became stratospheric; economic growth was born of the mechanization of agriculture and livestock, wages are rising and production costs are lower.)

text_seed = "Unde dai şi unde crapă."
Unde dai şi unde crapă. În ciuda faptului că, în ciuda unui astfel de scop, ceva întâmplător se va întâmpla, cineva va întâlni, într-un post de televiziune, un politicien care-l va acuza pe un lider politic de corupţie, şi cineva va spune, în numele lui, ...
(English: Where you strike and where it cracks. Despite the fact that, despite such a goal, something accidental will happen; someone will meet, on a TV station, a politician who will accuse a political leader of corruption, and someone will say, on his behalf, ...)

text_seed = "def palindrome(string): # returns the string backwards"
def palindrome(string):     # returns the string backwards in order
    for i, char in enumerate(string):
        if char!= string[-i - 1]:
            return string[:i] + string[-i - 1]
    return string
Usage:
print(palindrome("I ate a piece of pie yesterday")[::-1])
Output:
I ate a piece of pie yesterday
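For what it's worth, the generated function above doesn't do what its own comment promises: it returns as soon as two characters differ, and the "Output" line in the transcript doesn't match what the code actually prints. A working sketch of what the comment describes (string reversal, plus the palindrome check the function name hints at):

```python
def reverse(string):
    # What the generated comment promised: the string backwards
    return string[::-1]

def is_palindrome(string):
    # A string is a palindrome when it equals its own reversal
    return string == reverse(string)

print(reverse("I ate a piece of pie yesterday"))
# yadretsey eip fo eceip a eta I
print(is_palindrome("racecar"))
# True
```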
Didn't read any papers yet; only fooled around with an online demo of 'DALL·E mini', which generates images from word descriptions: https://huggingface.co/spaces/dalle-mini/dalle-mini Results ranged from funny to nightmarish-looking pics. :)
I remember Eliza's answers were based on online searches. While the generated text from Eliza and GPT looks similar, PyTorch/Transformers/GPT-Neo runs locally, no Internet required.
What I like most about GPT generators is that they can generate code (for basic Python at least; generating C doesn't work that well). These code suggestions can be very helpful for someone like me, not a programmer by profession, yet having to write small pieces of code once in a while. It can spare a lot of Stack Overflow searches, all while running locally, with no dependence on remote services from a 3rd party that can go offline.

Result

FreeBSD 11.0-RELEASE Now Available -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 The FreeBSD 11.0-RELEASE release is now available via: ftp://ftp.FreeBSD.org/pub/FreeBSD/ ############################################################################ FreeBSD 11.0-RELEASE - 496 packages available ############################################################################ The release notes for FreeBSD 11.0-RELEASE are available from the ftp://ftp.FreeBSD.org/pub/FreeBSD/releases/11.0/relnotes/. ISO images for the CD-ROM and