Replies: 1 comment
-
Yes, it works for image generation too. You need to launch the requests with `asyncio.gather` so they actually run concurrently:

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()

async def generate_image(prompt: str) -> str:
    response = await client.images.generate(
        model="dall-e-2",
        prompt=prompt,
        n=1,
        size="1024x1024",
    )
    return response.data[0].url

async def main():
    prompts = [
        "A sunset over mountains",
        "A cat in space",
        "A cyberpunk city",
        "A forest in autumn",
    ]
    # This runs them concurrently
    results = await asyncio.gather(*[generate_image(p) for p in prompts])
    for url in results:
        print(url)

asyncio.run(main())
```

Common mistake -- this looks async but runs sequentially:
```python
# WRONG: sequential despite using async
for prompt in prompts:
    result = await generate_image(prompt)  # blocks until each one completes
```

If you're still seeing sequential behavior with `asyncio.gather`, double-check that you aren't awaiting each coroutine individually as you create it, as in the loop above.
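A quick way to convince yourself that `asyncio.gather` really overlaps the requests is to time a stand-in coroutine. This sketch needs no API key; `fake_generate` is a made-up placeholder that simulates a network-bound call, not part of the OpenAI SDK:

```python
import asyncio
import time

async def fake_generate(prompt: str) -> str:
    # Stand-in for client.images.generate(): just waits like a network call would.
    await asyncio.sleep(0.2)
    return f"https://example.invalid/{prompt.replace(' ', '-')}.png"

async def main() -> float:
    prompts = ["sunset", "cat in space", "cyberpunk city", "autumn forest"]
    start = time.perf_counter()
    # All four coroutines run concurrently, so total time stays near 0.2 s
    # rather than the ~0.8 s a sequential await-in-a-loop would take.
    urls = await asyncio.gather(*[fake_generate(p) for p in prompts])
    assert len(urls) == 4
    return time.perf_counter() - start

elapsed = asyncio.run(main())
print(f"{elapsed:.2f}s")
```

If you run many prompts at once and hit API rate limits, the same pattern works with an `asyncio.Semaphore` wrapped around each call to cap how many requests are in flight.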
-
I know that one can use AsyncOpenAI for concurrent queries to GPT/chat models, but does it also work for concurrent image generation?
I'm trying to use the following inside an async function to generate multiple images concurrently (called by another async loop that runs several of these at once):

```python
await dalleclient.images.generate()  # where dalleclient is an AsyncOpenAI()
```

But I notice the image requests seem to be sent and processed sequentially, i.e. each image is only initiated when the previous one completes.
Am I doing something wrong, or is AsyncOpenAI only meant for concurrent GPT/chat use?
Thanks!!