Write a Go function that fans out HTTP requests using goroutines and collects results via channels
go-jun-002
Your answer
Answer as you would in a real interview — explain your thinking, not just the conclusion.
Model answer
I spawn one goroutine per URL, each writing its result to a buffered channel sized to the number of URLs so goroutines never block. A result struct carries the URL, status code, and any error so the caller knows which request failed. After all goroutines launch, I drain the channel exactly len(urls) times. This is simpler than sync.WaitGroup for result collection because the channel itself serialises all responses into one place.
Code example
package main

import (
	"fmt"
	"net/http"
)

type Result struct {
	URL    string
	Status int
	Err    error
}

func fetchAll(urls []string) []Result {
	// Buffered to len(urls) so no goroutine ever blocks on send,
	// even if the caller is slow to drain results.
	ch := make(chan Result, len(urls))
	for _, url := range urls {
		go func(u string) {
			resp, err := http.Get(u)
			if err != nil {
				ch <- Result{URL: u, Err: err}
				return
			}
			defer resp.Body.Close()
			ch <- Result{URL: u, Status: resp.StatusCode}
		}(url)
	}
	// Drain exactly len(urls) results; they arrive in completion order.
	results := make([]Result, len(urls))
	for i := range results {
		results[i] = <-ch
	}
	return results
}

func main() {
	urls := []string{"https://example.com", "https://go.dev"}
	for _, r := range fetchAll(urls) {
		fmt.Printf("%s -> %d %v\n", r.URL, r.Status, r.Err)
	}
}
Follow-up
How would you add a timeout so the function returns after 5 seconds even if some requests are still in-flight?