Meta-Llama-3-70B-Instruct

Meta developed and released the Meta Llama 3 family of large language models (LLMs), a collection of pretrained and instruction-tuned generative text models in 8B and 70B sizes.



Developer Portal:

Llama 3 - 70B

This affordable LLM API serves the Meta Llama 3 family of large language models (LLMs), developed and released by Meta: a collection of pre-trained and instruction-tuned generative text models in 8B and 70B sizes. The Llama 3 instruction-tuned models are optimized for dialogue use cases and outperform many of the available open-source chat models on common industry benchmarks.

Input: Models input text only.

Output: Models generate text and code only.

Model Architecture: Llama 3 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align with human preferences for helpfulness and safety.

Params: 70B
Context length: 8K
Token count: 15T+
Knowledge cutoff: December 2023

Intended Use Cases: Llama 3 is intended for commercial and research use in English. Instruction-tuned models are intended for assistant-like chat, whereas pre-trained models can be adapted for a variety of natural language generation tasks. This is an easy-to-use, low-cost LLM API priced at $1.20 per million tokens.
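At that rate, the cost of a single call can be estimated from the token counts the API returns in its `usage` field. A minimal sketch (the $1.20 per million tokens rate is taken from the pricing above; the token counts match the sample response later in this page):

```python
# Estimate the cost of one request from the usage counts returned by the API.
# Rate taken from the pricing above: $1.20 per million tokens.
PRICE_PER_MILLION_TOKENS = 1.20

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated cost in USD for one request."""
    total_tokens = prompt_tokens + completion_tokens
    return total_tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# 105 prompt tokens + 256 completion tokens = 361 tokens total
print(f"${estimate_cost(105, 256):.6f}")  # → $0.000433
```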

Carbon Footprint: Pretraining of the Llama 3 family utilized a cumulative 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta's sustainability program. For the 70B model alone:

Time (GPU hours): 6.4M
Power Consumption (W): 700
Carbon Emitted (tCO2eq): 1900

Training Data

Overview: Llama 3 was pre-trained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.

Data Freshness: The pretraining data has a cutoff of March 2023 for the 8B model and December 2023 for the 70B model.

Request and Response

Request

curl -X 'POST' \
  'https://api.magicapi.dev/api/v1/bridgeml/meta-llama3-70b/bridgeml/meta-llama3-70b' \
  -H 'accept: application/json' \
  -H 'x-magicapi-key: API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
  "messages": [
    {
      "role": "user",
      "content": "hello"
    },
    {
      "role": "assistant",
      "content": "As a senior software engineer create a AWS python lambda that calls an API with the given curl request and returns a response, take the input from the function parameters called events, example curl request:\r\ncurl -X GET \\\r\n  '\''https://api.magicapi.dev/api/v1/magicapi/whois/whois/google.com'\'' \\\r\n  -H '\''accept: application/json'\'' \\\r\n  -H '\''x-magicapi-key: api-key"
    }
  ],
  "temperature": 1,
  "max_tokens": 256,
  "top_p": 1,
  "frequency_penalty": 0,
  "stream": false
}'

Response

{
  "id": "meta-llama/Meta-Llama-3-70B-Instruct-afc4c52e-ebab-4e8a-bebf-108ecf511758",
  "object": "text_completion",
  "created": 1718783159,
  "model": "meta-llama/Meta-Llama-3-70B-Instruct",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "Here is an example of an AWS Python Lambda function that calls the API with the given curl request and returns a response:\n```\nimport boto3\nimport json\nimport requests\n\ndef lambda_handler(event, context):\n    # Extract the API key from the environment variables\n    api_key = 'YOUR_API_KEY_HERE'\n\n    # Extract the domain from the event\n    domain = event['domain']\n\n    # Construct the API URL\n    url = f'https://api.magicapi.dev/api/v1/magicapi/whois/whois/{domain}'\n\n    # Set the headers\n    headers = {\n        'accept': 'application/json',\n        'x-magicapi-key': api_key\n    }\n\n    # Make the GET request\n    response = requests.get(url, headers=headers)\n\n    # Check if the response was successful\n    if response.status_code == 200:\n        # Return the response as JSON\n        return {\n            'statusCode': 200,\n            'body': json.dumps(response.json())\n        }\n    else:\n        # Return an error message\n        return {\n            'statusCode': response.status_code,\n            'body': json.dumps({'error': 'API request failed'})\n        }\n```\nHere's an explanation of the code:\n\n*",
        "tool_calls": null,
        "tool_call_id": null
      },
      "index": 0,
      "finish_reason": "length",
      "logprobs": null
    }
  ],
  "usage": {
    "prompt_tokens": 105,
    "completion_tokens": 256,
    "total_tokens": 361
  }
}
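The same request can also be made from Python using only the standard library. This is a sketch based on the curl example above: the endpoint URL, headers, and body fields come from that request, while the function names (`build_request`, `chat`) are illustrative. Replace `API_KEY` with your own x-magicapi-key before running.

```python
import json
import urllib.request

URL = "https://api.magicapi.dev/api/v1/bridgeml/meta-llama3-70b/bridgeml/meta-llama3-70b"

def build_request(prompt: str, api_key: str, max_tokens: int = 256):
    """Build the headers and JSON body for a chat completion request."""
    headers = {
        "accept": "application/json",
        "x-magicapi-key": api_key,
        "Content-Type": "application/json",
    }
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1,
        "max_tokens": max_tokens,
        "top_p": 1,
        "frequency_penalty": 0,
        "stream": False,
    }
    return headers, body

def chat(prompt: str, api_key: str) -> str:
    """POST the request and return the assistant's reply text."""
    headers, body = build_request(prompt, api_key)
    req = urllib.request.Request(
        URL, data=json.dumps(body).encode(), headers=headers, method="POST"
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        data = json.loads(resp.read())
    # The reply text lives in choices[0].message.content, as in the sample response above.
    return data["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("hello", "API_KEY"))  # replace API_KEY with your key
```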

You can try this affordable, easy-to-use LLM API here:

https://api.market/store/bridgeml/meta-llama3-70b