
Microsoft's AI ambitions put Google on notice

Promptstacks hits 10,000 members!

Overview

  • Preamble

  • AI News: Google eats dogfood

  • Why has Google waited to release their product?

  • Have you been living under a rock?

  • Running through your first prompt

  • Promptstacks

  • Product of the week

  • Here’s a short list of other selected prompts from the forum

  • AI News (you didn’t know) 2023

Preamble

What a week it’s been. Today, we’ll cover the latest movements in the AI industry and run our eyes over a few prompts posted on the Promptstacks forum, which has since grown to over 10,000 members!

Promptstacks.com

AI News: Google eats dogfood

Generative AI can be a little scary sometimes

It’s red alert at Google headquarters, where CEO Sundar Pichai emailed his staff asking them to “eat [their] own dogfood”. To outsiders, this might seem like a peculiar email to send out, especially with the AI war between Google and Microsoft in full swing.

In the world of tech, ‘dogfooding’ means consistently using a product that your own company has built. So when Google employees are told to eat their own dogfood, it means staff rapidly testing the company’s own generative AI large language model, Bard, named after the old Celtic storytellers.

In short, Bard is an experimental AI conversation service that will tackle the likes of ChatGPT using Google’s Language Model for Dialogue Applications (LaMDA).

LaMDA was announced two years ago, but it has not been made available for public testing, unlike ChatGPT.

Why has Google waited to release their product?

Considering the extensive hype around ChatGPT over the past few months, and the reported growth of OpenAI’s free model to around 100 million users, it begs the question: why hasn’t Google shipped (released) its Bard model earlier, especially given the company’s stated policy of “re-orienting the company around AI” at least six years ago?

I mean, you would have thought they’d have released an industry-defining product earlier, right?

Well, maybe not. Perhaps they held back because something was wrong with their large language model, or because they didn’t think the public was ready for it. Or maybe it has something to do with Blake Lemoine, a software engineer at Google who was fired in July 2022 after claiming that LaMDA (the model that powers Bard) was sentient.

Was this marketing spin from Google, or is their system way ahead of OpenAI’s GPT-3.5?

We can’t know until it’s released. But one thing we can expect is aggressive competition between the two behemoths. In the end, it might not matter who moved first in the search engine game.

Google benefited from the mistakes made by Yahoo and other search engines, and until a few weeks ago, almost everyone assumed Google would always be the pre-eminent search engine.

Now, it’s different. Bing (Microsoft’s search engine) was re-released yesterday as the ‘new Bing’, with ChatGPT’s functions integrated. Bing now has a temporary but serious user advantage that could save its customers hours of searching for the exact data they want.

New Bing!

Why? Well firstly, the current system is outdated. Everyone struggles to find the right answer to their question in Google, even more so if they want it to be focussed and specific. You’re flooded by links, ads and incorrect answers that seem to be about another question someone else asked.

Why? Because Google’s current system favours articles and blog posts as the sources of information for particular questions and queries.

For example, if I were to search “how to cook a Pad Thai”, I’d find my answers in a blog article that some self-proclaimed chef in Manchester might have written last year. But what if you could get a single, concise answer, written in natural language, with links to read more information? LLM-powered search engines make this possible.

Imagine this in the context of product recommendations. If I was to search now, I’d have to sort through hundreds of links and reviews to find a product that I wanted to buy.

What if I could type into the search bar “Best indoor plant for a room that’s 20 degrees celsius, that also survives with very limited sunlight and doesn’t need watering much AND will never get mouldy roots.” 

Zap, an answer (or completion) is churned out specific to that question, saving you time scrolling and giving you what you want to know in seconds. It’s game-changing, and it’s why Microsoft has moved so fast to integrate ChatGPT with Bing: this might finally be their opportunity to get a foot ahead of Google in the AI race.

As Satya Nadella told reporters a few days ago, “It’s a new day in search,” and ultimately, “It’s a new paradigm for search [engines]. Rapid innovation is going to come. The race starts today.”

Sounds a little like when we went to the moon right?

Have you been living under a rock?

On 30th November 2022, OpenAI introduced ChatGPT to the world.

The foundation engine, GPT-3, underwent pre-training equivalent to roughly 300 single-GPU years, at a cost to OpenAI of around $5 million, using data from a broad range of sources, including academic papers and Wikipedia, up to a cutoff in 2021. This means it doesn’t have access to Google or to any data from after 2021.
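
The “roughly 300 years” figure lines up with public back-of-envelope estimates. A quick sketch, using GPT-3’s reported training compute of about 3.14×10²³ FLOPs (from the GPT-3 paper) and an assumed sustained throughput of ~28 TFLOPS on a single V100 GPU (both numbers are outside this newsletter):

```python
# Back-of-envelope check of the "hundreds of GPU-years" claim.
# Assumptions: ~3.14e23 FLOPs total training compute (reported for
# GPT-3) and ~28 TFLOPS sustained on one V100-class GPU.
TOTAL_FLOPS = 3.14e23
GPU_FLOPS_PER_SEC = 28e12
SECONDS_PER_YEAR = 365.25 * 24 * 3600

gpu_years = TOTAL_FLOPS / GPU_FLOPS_PER_SEC / SECONDS_PER_YEAR
print(f"{gpu_years:.0f} single-GPU years")  # roughly 350
```

In practice the training ran on thousands of GPUs in parallel, so the wall-clock time was weeks, not centuries.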

ChatGPT operates on a "frozen" model and lacks the capacity for continuous learning. During the training process, it formed billions of connections across hundreds of billions of words, enabling it to predict the next word following a prompt, such as a query or question, with statistical accuracy.
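
The “predict the next word” idea can be sketched with a toy bigram model: count which word tends to follow which, then pick the most frequent successor. (This is a drastic simplification of what a large language model does, but the statistical principle is the same; the tiny corpus below is made up.)

```python
from collections import Counter, defaultdict

# Toy training corpus (entirely made up for illustration).
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the mouse . the cat slept ."
).split()

# Count, for each word, which words follow it and how often.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often here
```

A real LLM does the same thing at vastly greater scale, predicting over every token in its vocabulary using billions of learned parameters rather than raw counts.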

Why is this important? The LLM or large language model is far more effective than previous natural language processing models. See below for a perspective on the parameters it’s been trained on.

Running through your first prompt

Large language models like ChatGPT use a thing called "prompting" or "priming".

Basically, you type in what you're looking for (your question, request, or query) and ChatGPT spits out an answer. Just a heads up, it works best if you give a clear and direct prompt, instead of asking a question. And if you don't like the answer, just start a new chat and try again.

No need for fancy, long prompts - a short and simple one will do the trick.

If you still don’t get it, the really basic way to explain ChatGPT (and the models it runs on) is that it’s software that likes to finish your sentences for you. You provide it with a starting set of words, and it tries to figure out the most likely words that follow. It’s very flexible and can talk about anything you want, from creating your shopping list to breaking down thermodynamics for a 4-year-old.

It’s software that likes to finish your sentences for you.

Probably ChatGPT

The set of words you provide is called a prompt, and the answer you get back from GPT-3 is called a completion.
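
The prompt/completion split maps directly onto OpenAI’s completions API. Here’s a minimal sketch of what such a request looks like under the hood, built with Python’s standard library; the model name, token limit, and prompt are illustrative choices, the key is a placeholder, and nothing is actually sent:

```python
import json
import urllib.request

# The prompt you send, plus parameters shaping the completion.
# Model name and max_tokens are illustrative, not prescriptions.
payload = {
    "model": "text-davinci-003",
    "prompt": "Suggest three survey questions for my newsletter subscribers:",
    "max_tokens": 150,
    "temperature": 0.7,
}

request = urllib.request.Request(
    "https://api.openai.com/v1/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",  # placeholder key
    },
    method="POST",
)

# The completion text would come back in response["choices"][0]["text"];
# we stop short of sending the request here.
print(request.full_url)
```

In the ChatGPT web interface, all of this plumbing is hidden: you just type the prompt and read the completion.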

Below I’ll take you through using a prompt quickly. We’ll loosely follow a prompt posted on the Promptstacks forum by Farah.

A prompt on Promptstacks

Essentially, it’s a simple market research query, though I want to tweak it slightly so that it’s personalised for me. I want to know what kind of questions I could ask my newsletter subscribers to grow and build my newsletter. How do I do that? I prompt ChatGPT. The text at the top is what I typed as my prompt, and the text below is what ChatGPT returned as the completion:

Using ChatGPT

As you can see above, I’ve asked ChatGPT to utilise its knowledge of newsletter research to provide a completion that answers my query or prompt. Good, right?

This was a quick snapshot of one of the prompts on the forum. If you want to see more, visit Promptstacks below.

Promptstacks

This week our community reached 10,000 users. We’ve had quite a few interesting threads on a range of topics, from how best to prompt-engineer and maximise the return on your individual prompts, to how to get ChatGPT to produce the best marketing copy.

Product of the week:

Our product of the week is Byword. If you’re a marketing or SEO fan, or even an entrepreneur who wants to flood the internet with keyword-optimised articles for your site, Byword provides that at scale within a few clicks. Essentially, you type in a keyword and it writes you a 1,500-word article, with different headlines optimised for the Google algorithm, within seconds. Cool, right?

If you want to test it out for free, head to the site through the link below and get access to five free article generations.

Byword sign-up

Here’s a short list of other selected prompts from the forum

Social Media Content Calendar

Twitter Thread

AI News (you didn’t know) 2023

Google announces a new conversational AI technology, “Bard”, which it will open up to public testing. Bard, powered by LaMDA, will compete directly with rival ChatGPT.

Apple is using AI for audiobooks.

Claude beta testers share new screenshots.

DeepMind's DreamerV3 can make a bot extract diamonds in Minecraft.

Microsoft's VALL-E can clone voice with just 3 seconds of audio.

DeepMind's Sparrow is entering private beta in 2023.

Meta ReVISE is a lip reading AI model.

GPTZero and GPTZeroX are detecting AI generated text.

GPT-3.5 passes CPA, bar, and medical board exams.

Schools are alarmed by ChatGPT.

Microsoft is incorporating GPT into their products like Bing, Word, Powerpoint and Outlook.

Stable Diffusion is facing two lawsuits.

ChatGPT API waitlist and Microsoft Azure OpenAI Service now available.

Google is rushing AI products for May conference.

AlphaFold is credited for first drug design using AI

Meta CPO hints at AI in Instagram filters.

Scenario.gg raises $6 million in funding.

Magic.dev raises $23m series A to develop a software engineer assistant.

InstructPix2Pix is a "Photoshop with words" tool.

NVIDIA Eye Contact demo attracts attention.

Anthropic raises $300 million in funding.

Character.ai secures $250 million in funding.

ElevenLabs raises $2 million; its voice-cloning tool draws attention after misuse on 4chan.

Congressman Ted Lieu is advocating for AI regulation.

McKinsey acquires Iguazio for $50 million.

White House task force proposing $2.6 billion for federal AI cloud.

Recent research on watermarking solutions.

Shutterstock is incorporating AI in image generation.

Atomic AI raises $35 million in funding.

CNET is criticised for AI-written articles, offers apology.

Buzzfeed stock doubles with AI announcement.

MusicLM is a music generation model from Google.

Baidu introduces their own ChatGPT.

Fiverr adds AI Services category.

SingSong generates accompaniment from vocals.

If you find this newsletter valuable, share it with a friend, and consider subscribing if you haven’t already.