cURL GET Request: Examples & Options

The cURL GET request is the most fundamental HTTP operation. GET requests retrieve data from a server and are the default method cURL uses. This guide covers every option and scenario for making GET requests with cURL, from basic fetches to advanced configurations.

Basic GET Request

GET is the default HTTP method in cURL — you do not need to specify it explicitly.

# Simple GET request
curl https://api.example.com/users

# Explicit GET method (redundant, but harmless)
curl -X GET https://api.example.com/users

# GET with output saved to a file
curl -o users.json https://api.example.com/users

GET Request with Query Parameters

# Query parameters in the URL
curl "https://api.example.com/users?page=1&limit=10&sort=name"

# URL-encode special characters with --data-urlencode and -G
curl -G \
  --data-urlencode "search=hello world" \
  --data-urlencode "category=books & media" \
  https://api.example.com/search

The -G flag appends -d/--data-urlencode values to the URL as query parameters instead of sending them as a POST body.

GET with Custom Headers

# Accept JSON response
curl -H "Accept: application/json" https://api.example.com/users

# Multiple headers
curl -H "Accept: application/json" \
  -H "Authorization: Bearer TOKEN" \
  -H "X-Request-ID: 12345" \
  https://api.example.com/users

# Set User-Agent
curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0.0.0" \
  https://example.com

Viewing Response Details

# Response body only (default)
curl https://api.example.com/users

# Headers only (note: -I sends a HEAD request, not a GET)
curl -I https://api.example.com/users

# Headers + body
curl -i https://api.example.com/users

# Verbose output (shows request and response headers)
curl -v https://api.example.com/users

# Show only the HTTP status code
curl -s -o /dev/null -w "%{http_code}" https://api.example.com/users

# Show status code, content type, and response size
curl -s -o /dev/null -w "Code: %{http_code}\nType: %{content_type}\nSize: %{size_download}\n" \
  https://api.example.com/users
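The -w write-out format supports many more variables than the ones above. A small sketch, wrapped in a hypothetical `timing_for` helper so it can be pointed at any URL; all `%{...}` variable names are from curl's documented --write-out list:

```shell
# timing_for: print per-phase timings for a single GET request
timing_for() {
  curl -s -o /dev/null \
    -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTotal: %{time_total}s\n" \
    "$1"
}

# Usage:
# timing_for https://api.example.com/users
```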

GET with Authentication

# Basic authentication
curl -u username:password https://api.example.com/protected

# Bearer token
curl -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." https://api.example.com/data

# API key in header
curl -H "X-API-Key: your-api-key" https://api.example.com/data

# API key in query string
curl "https://api.example.com/data?api_key=your-api-key"

GET Through a Proxy

# HTTP proxy
curl -x http://proxy:8080 https://api.example.com/users

# Authenticated proxy
curl -x http://user:pass@proxy:8080 https://api.example.com/users

# SOCKS5 proxy (socks5h:// lets the proxy resolve hostnames)
curl -x socks5h://user:pass@proxy:1080 https://api.example.com/users

Using residential proxies with GET requests is common for web scraping, allowing you to rotate IPs and avoid blocks.
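A minimal rotation sketch in bash, assuming you already have a list of proxy URLs (the addresses in the usage line are placeholders, not real endpoints):

```shell
# rotate_get: fetch pages 1..$1, cycling round-robin through the proxies
# listed after it (bash)
rotate_get() {
  local pages=$1; shift
  local proxies=("$@")
  local n=${#proxies[@]} i
  for ((i = 1; i <= pages; i++)); do
    # proxy index (i-1) mod n, so requests cycle through the list
    curl -s -x "${proxies[(i - 1) % n]}" \
      "https://example.com/page/$i" -o "page_$i.html"
  done
}

# Usage:
# rotate_get 6 http://proxy1:8080 http://proxy2:8080 http://proxy3:8080
```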

Following Redirects

# Follow redirects (3xx responses)
curl -L https://example.com/old-url

# Follow redirects, capped at 5 hops
curl -L --max-redirs 5 https://example.com/old-url

# Show the redirect chain (-i on grep: the header is "Location" in HTTP/1.1, "location" in HTTP/2)
curl -L -v https://example.com/old-url 2>&1 | grep -i "< HTTP\|< location"

Handling JSON Responses

# Pretty-print JSON (pipe to jq)
curl -s https://api.example.com/users | jq .

# Extract a specific field
curl -s https://api.example.com/users | jq '.[0].name'

# Filter results
curl -s https://api.example.com/users | jq '.[] | select(.active == true)'

# Count results
curl -s https://api.example.com/users | jq length

Timeout and Retry

# Set connection timeout (seconds)
curl --connect-timeout 10 https://api.example.com/users

# Set maximum time for the entire operation (seconds)
curl --max-time 30 https://api.example.com/users

# Retry up to 3 times, waiting 5 seconds between attempts
# (only transient errors are retried by default; add --retry-all-errors to retry everything)
curl --retry 3 --retry-delay 5 https://api.example.com/users

Cookies with GET Requests

# Send cookies
curl -b "session=abc123; theme=dark" https://example.com

# Load cookies from a file
curl -b cookies.txt https://example.com

# Save cookies from the response to a file
curl -c cookies.txt https://example.com

Conditional GET Requests

# Only fetch if modified since a date
curl -H "If-Modified-Since: Mon, 01 Jan 2024 00:00:00 GMT" https://example.com

# Only fetch if the ETag has changed
curl -H 'If-None-Match: "etag-value"' https://api.example.com/data

Both requests return 304 Not Modified if the content hasn't changed.
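The ETag variant can be combined into a simple revalidation flow: capture the ETag response header on the first fetch, then send it back later. A sketch, with a hypothetical `revalidate` helper and illustrative file names:

```shell
# revalidate: fetch $1 once, then check whether it has changed via If-None-Match
revalidate() {
  url=$1
  # first request: save the body and capture the response headers
  curl -s -D headers.txt -o data.json "$url"
  etag=$(awk 'tolower($1) == "etag:" { print $2 }' headers.txt | tr -d '\r')

  # later request: a 304 means the cached data.json is still current
  code=$(curl -s -o /dev/null -w "%{http_code}" -H "If-None-Match: $etag" "$url")
  if [ "$code" = "304" ]; then
    echo "cache is fresh"
  fi
}

# Usage:
# revalidate "https://api.example.com/data"
```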

Parallel GET Requests

# Sequential (slow)
curl https://api.example.com/page/1
curl https://api.example.com/page/2
curl https://api.example.com/page/3

# Parallel with xargs (one number per line, since -I processes per line; -P5 runs up to 5 at once)
printf '%s\n' 1 2 3 4 5 | xargs -P5 -I{} curl -s -o "page_{}.json" "https://api.example.com/page/{}"

# Parallel with GNU parallel
parallel curl -s -o "page_{}.json" "https://api.example.com/page/{}" ::: {1..10}
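Since curl 7.66.0, parallel transfers are also built in via -Z/--parallel, which pairs well with curl's own URL globbing. A sketch with a hypothetical wrapper function; in the -o template, "#1" expands to the number matched by [1-10]:

```shell
# fetch_pages_parallel: native parallel transfers, no external tools (curl 7.66.0+)
fetch_pages_parallel() {
  curl --parallel --parallel-max 5 \
    -s -o "page_#1.json" "$1"
}

# Usage:
# fetch_pages_parallel "https://api.example.com/page/[1-10]"
```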

Python Equivalent

import requests

# Basic GET
response = requests.get("https://api.example.com/users")
print(response.json())

# With headers
headers = {
    "Accept": "application/json",
    "Authorization": "Bearer TOKEN",
}
response = requests.get("https://api.example.com/users", headers=headers)

# With query parameters
params = {"page": 1, "limit": 10, "sort": "name"}
response = requests.get("https://api.example.com/users", params=params)

# With proxy
proxies = {"https": "http://user:pass@proxy:8080"}
response = requests.get("https://api.example.com/users", proxies=proxies)

Common GET Request Patterns

Use Case             cURL Command
Fetch JSON API       curl -H "Accept: application/json" URL
Download file        curl -O URL
Check if URL works   curl -s -o /dev/null -w "%{http_code}" URL
View headers         curl -I URL
Follow redirects     curl -L URL
With auth            curl -u user:pass URL
Through proxy        curl -x proxy:port URL
Save cookies         curl -c cookies.txt URL
Pretty JSON          curl -s URL | jq .



FAQ

Do I need -X GET with cURL?

No. GET is the default method. Using -X GET is redundant but harmless. Only specify -X when using POST, PUT, PATCH, or DELETE.

How do I send GET request data in the body?

While technically possible with -d and -X GET, sending a body with GET requests is non-standard and many servers ignore it. Use query parameters instead.

Why does my GET request return HTML instead of JSON?

The server may default to HTML. Add -H "Accept: application/json" to request JSON format. Some APIs also use different URLs or extensions (.json) for different formats.

How do I handle pagination with cURL?

Use a bash loop: for i in $(seq 1 10); do curl -s "https://api.example.com/data?page=$i" >> results.json; done
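When the page count isn't known in advance, the fixed loop can be generalized to stop when the API runs out of results. A sketch, assuming the endpoint returns a JSON array and an empty [] past the last page:

```shell
# fetch_all_pages: request ?page=1,2,... until an empty JSON array comes back
fetch_all_pages() {
  base=$1
  page=1
  while :; do
    body=$(curl -s "${base}?page=${page}")
    if [ "$body" = "[]" ]; then
      break   # empty array marks the last page
    fi
    printf '%s\n' "$body"
    page=$((page + 1))
  done
}

# Usage:
# fetch_all_pages "https://api.example.com/data" >> results.json
```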
