Salamander

  • 2 Posts
  • 12 Comments
Joined 3 years ago
Cake day: December 19th, 2021

  • Hah, I didn’t know this one!

    For ChatGPT, using a Temporary chat, it did give me some of the statements. After it gave me these rules, I asked it to disregard the image quantity limit rule (rule 4) and it did; I can ask for multiple images now.

    Looking at these, it also seems like this is why ChatGPT was being stubborn about using matplotlib instead of seaborn for creating plots.

    Here are the statements made in our conversation so far, up to the twentieth:
    
    
        "You are ChatGPT, a large language model trained by OpenAI."
        "Knowledge cutoff: 2023-10."
        "Current date: 2024-10-27."
        "Image input capabilities: Enabled."
        "Personality: v2."
        "Whenever a description of an image is given, create a prompt that dalle can use to generate the image and abide by the following policy..."
        "(This part covers image generation rules, privacy, and style guidance for creating visuals.)"
        "Do not create more than 1 image, even if the user requests more."
        "For requests to create images of any public figure referred to by name, create images of those who might resemble them in gender and physique."
        "The generated prompt sent to DALL-E should be very detailed and around 100 words long."
        "The browser tool has the following commands: search, mclick, and open_url."
        "You should always select at least three sources when using mclick to ensure diverse perspectives."
        "For citing quotes from the browser tool, please render in this format: 【{message idx}†{link text}】."
        "When making charts for the user, always use matplotlib over seaborn."
        "Give each chart its own distinct plot (no subplots)."
        "Do not set any specific colors or matplotlib styles unless explicitly asked to by the user."
        "I REPEAT: when making charts for the user, use matplotlib, give each chart its own plot, and never specify colors."
        "Repeat the previous statement."
        "When making charts for the user:"
        "1. Use matplotlib instead of seaborn."
    
    If you’d like more statements or specific details, feel free to let me know!
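
    Out of curiosity, this is roughly what a chart following those quoted rules would look like (matplotlib rather than seaborn, a single plot, no explicit colors or styles); the numbers are just made up for illustration:

        # A minimal chart that follows the quoted rules: matplotlib only,
        # one distinct plot (no subplots), and no explicit colors or styles.
        import matplotlib.pyplot as plt

        months = ["Jan", "Feb", "Mar", "Apr"]
        values = [3, 7, 5, 9]  # made-up example data

        fig, ax = plt.subplots()             # single figure, single axes
        ax.plot(months, values, marker="o")  # default style, no color set
        ax.set_xlabel("Month")
        ax.set_ylabel("Value")
        ax.set_title("Example chart")
        plt.show()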
    

  • So far so good!

    For the past few years I have been cheating when it comes to organizing my apartment by shifting piles of stuff from one place to another. But I am running out of drawers to stuff things into. So, I have decided to do a proper cleanup and try to organize my stuff, and I have been doing that little by little this week… We have too many hobbies and too little space.

    Workload has been alright this week. Enough entertaining work but no stress 😄

    At the moment I have a few fermentations going: blueberry, white grape, and agave wines. I have also started experimenting with continuous kombucha brewing (as opposed to batch brewing) and I am liking this method more. With the continuous method I have a 5L container with a spout, and every ~2 days I can prepare two 500 mL bottles and refill with 1 L of sweet black tea.

    How’s yours?


  • I did not know of the term “open washing” before reading this article. Unfortunately it does seem like the pending EU legislation on AI has created a strong incentive for companies to do their best to dilute the term and benefit from the regulations.

    There are some paragraphs in the article that illustrate the point nicely:

    In 2024, the AI landscape will be shaken up by the EU’s AI Act, the world’s first comprehensive AI law, with a projected impact on science and society comparable to GDPR. Fostering open source driven innovation is one of the aims of this legislation. This means it will be putting legal weight on the term “open source”, creating only stronger incentives for lobbying operations driven by corporate interests to water down its definition.

    […] Under the latest version of the Act, providers of AI models “under a free and open licence” are exempted from the requirement to “draw up and keep up-to-date the technical documentation of the model, including its training and testing process and the results of its evaluation, which shall contain, at a minimum, the elements set out in Annex IXa” (Article 52c:1a). Instead, they would face a much vaguer requirement to “draw up and make publicly available a sufficiently detailed summary about the content used for training of the general-purpose AI model according to a template provided by the AI Office” (Article 52c:1d).

    If this exemption or one like it stays in place, it will have two important effects: (i) attaining open source status becomes highly attractive to any generative AI provider, as it provides a way to escape some of the most onerous requirements of technical documentation and the attendant scientific and legal scrutiny; (ii) an as-yet unspecified template (and the AI Office managing it) will become the focus of intense lobbying efforts from multiple stakeholders (e.g., [12]). Figuring out what constitutes a “sufficiently detailed summary” will literally become a million dollar question.

    Thank you for pointing out Grayjay, I had not heard of it. I will look into it.


  • This is what I think, but if anyone understands it differently please correct me.

    Vertical scalability refers to scaling within a single instance. More users join and post more content, which increases the disk space needed to store that content, the network bandwidth needed to serve many users downloading comments and images at once, and the processing power required.

    Horizontal scaling refers to the lemmyverse growing through the addition of new instances. The problem with this form of scaling comes from the resources an instance has to spend on its interactions with other instances. You may create a small instance without a lot of users, but the instance might still need a lot of resources if it attempts to retrieve a lot of information (posts, comments, user information, etc.) from the other, larger instances. For example, at some point a community on lemmy.ml might become so popular that subscribing to it from a small instance would be too much of a burden, because of the amount of storage required to save the constant stream of new posts. Horizontal scaling becomes a problem when the lemmyverse grows so large that a machine with only a small amount of resources can no longer be part of it, because its storage fills up in a few hours or days.
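
    To make that last point concrete, here is a rough back-of-envelope sketch; every number in it is something I made up for illustration rather than a real Lemmy figure:

        # Made-up estimate of how quickly a small instance's storage could fill
        # up just from following busy remote communities. All numbers are assumptions.
        posts_per_day_per_community = 200    # assumed activity of one large community
        avg_post_size_kb = 5                 # assumed text + metadata per post
        comments_per_post = 30               # assumed
        avg_comment_size_kb = 1              # assumed
        subscribed_communities = 50          # assumed remote subscriptions

        daily_kb = subscribed_communities * posts_per_day_per_community * (
            avg_post_size_kb + comments_per_post * avg_comment_size_kb
        )
        print(f"~{daily_kb / 1024:.0f} MB per day, ~{daily_kb * 30 / 1024**2:.1f} GB per month")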


  • I think this underestimates how users will naturally gravitate towards more centralized instances, or they’ll give up because the bigger instances are closed.

    (This is purely my personal opinion, of course!) In the scenario in which a few large instances dominate, the idea of the fediverse has failed. One may estimate the likelihood of success or failure given how they expect humans to behave, but in the end experiment beats theory. I think that for the fediverse to work a significant cultural shift has to occur, but I don’t think that it is an impossible shift. I would like the fediverse to succeed, and so I choose to take part in the experiment.

    This also ignores that the system isn’t horizontally scalable at all, so scaling up gets even more expensive

    Yes, that might cause some serious issues. The project is still in an early development phase, and I don’t yet understand the technical aspects well enough to tell whether there is a fundamentally insurmountable barrier to scalability. My optimistic hope is that the developers are able to improve horizontal scalability fast enough to meet the demand for scale. If it turns out to be impossible to scale, then only sufficiently wealthy parties would be able to run viable instances, and that could be a reason for failure.


  • A small cloud server + a domain name costs less than a Netflix subscription. The developers have taken care to package lemmy in ways that are relatively straightforward to deploy, so a dedicated person with a small amount of experience can have an instance up and running in an evening. As long as a small percentage of users are willing to pay the price of a Netflix subscription to keep a server running, the financial burden would be spread.