Istanbul residents recently had a scare when a magnitude 6 earthquake hit the nearby city of Duzce. When we felt the tremors, I wanted to confirm that it was actually an earthquake, and wanted to check where the epicenter was.

I went to the website of the national disaster management agency, AFAD, and found it overloaded and inaccessible. I then tried the website of the Kandilli earthquake observatory, which was also down under the load.

At the same time, we managed to get some information via Twitter, and later learned more details with various earthquake apps. In a future event, I don’t want to be searching for random websites and apps, so I decided to create a consolidated data source that I can quickly access. I’m sharing it here in case it’s useful to anyone else.

AFAD data

AFAD, the national disaster management agency, has a website that provides information about the latest earthquakes. It uses an HTTP endpoint that returns a JSON object with the latest earthquake data. Here’s how it works.

To get data out of the endpoint, you need to provide a filter: a JSON object that specifies the start and end dates of the data you want. Here’s an example.

import requests
import datetime

URL = ""

# Query the last 24 hours of events
end_date = datetime.datetime.now()
start_date = end_date - datetime.timedelta(days=1)

event_filter = {
    "EventSearchFilterList": [
        {"FilterType": 8, "Value": start_date.isoformat()},
        {"FilterType": 9, "Value": end_date.isoformat()},
    ],
    "Skip": 0,
    "Take": 100,
    "SortDescriptor": {"field": "eventDate", "dir": "desc"},
}

resp = requests.post(URL, json=event_filter)
data = resp.json()
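The shape of the returned events isn’t documented here; the only field name we can infer is eventDate, since it appears in the SortDescriptor. Assuming each event also carries hypothetical magnitude and location fields (those two names are guesses), a one-line-per-event summary could be sketched like this:

```python
# "eventDate" is inferred from the SortDescriptor above; "magnitude"
# and "location" are assumed field names -- check a real response.
def summarize_events(events):
    lines = []
    for event in events:
        lines.append(
            f"{event['eventDate']}  M{event['magnitude']}  {event['location']}"
        )
    return lines

sample = [
    {"eventDate": "2022-11-23T01:08:15", "magnitude": 6.0, "location": "Duzce"},
]
for line in summarize_events(sample):
    print(line)
```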

EMSC LastQuake app

The European Mediterranean Seismological Centre (EMSC) has an Android app called “LastQuake”. It has a backend that returns earthquake data in GeoJSON format. One endpoint returns all earthquakes, while another returns only the ones that are considered “significant”.
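Whatever the exact endpoint, the GeoJSON format itself is standard: a FeatureCollection whose features carry a geometry with [lon, lat] coordinates and a properties object with the event details. A minimal parser might look like this (the mag property name is an assumption; verify it against a real response):

```python
# Minimal GeoJSON FeatureCollection parser.  The "mag" property name
# is an assumption -- inspect an actual payload before relying on it.
def parse_quakes(feature_collection):
    quakes = []
    for feature in feature_collection["features"]:
        lon, lat = feature["geometry"]["coordinates"][:2]
        quakes.append({
            "magnitude": feature["properties"].get("mag"),
            "lon": lon,
            "lat": lat,
        })
    return quakes

sample = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [31.0, 40.8, 10.0]},
            "properties": {"mag": 6.0},
        }
    ],
}
print(parse_quakes(sample))
```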

Kandilli data

The Kandilli Observatory and Earthquake Research Institute also has a website, possibly the most popular page that everyone uses to check for earthquakes.

It returns an HTML page with the latest earthquake data. The data is formatted as plain text and contained in a <pre> tag.
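Since the data lives inside a single <pre> tag, extracting it doesn’t require a full HTML parser. A regex sketch, assuming there is only one <pre> block on the page:

```python
import re

def extract_pre(html):
    """Return the contents of the first <pre> block, or None if absent."""
    match = re.search(r"<pre>(.*?)</pre>", html, flags=re.DOTALL | re.IGNORECASE)
    return match.group(1) if match else None

sample = "<html><body><pre>2022.11.23 01:08:15  40.8 31.0  M6.0</pre></body></html>"
print(extract_pre(sample))
```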

Another URL appears to return the same data.

Kandilli mobile app

The same observatory also has a mobile app that provides earthquake data. The app is basically a webview that loads a mobile-friendly version of the website.

import requests
import re

URL = ""
html = requests.get(URL).text

# Remove HTML comments
html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)

# Each earthquake is rendered via a deprem_detay(...) call with seven arguments
pattern = r"deprem_detay\('(.*?)','(.*?)','(.*?)','(.*?)','(.*?)','(.*?)','(.*?)'\);"
matches = re.findall(pattern, html)

for match in matches:
    print(match)

Earthquake Network application

I found that a lot of people were using an Android app called “Earthquake Network”. It seems to have a PHP backend that returns the earthquake data in JSON format. Here are some interesting endpoints.

There is an endpoint that returns all earthquakes greater than a given magnitude.

The mag parameter specifies the minimum magnitude, and the pro parameter specifies the organization that reported the earthquake. The value all means all organizations. To get the maximum number of earthquakes, you can set the magnitude to 0 and the organization to all.

Some values for the pro parameter are all, bdtim, csi, csn, emsc, funvisis, geonet, ign, ineter, ingv, inpres, jma, ncs, phivolcs, rsn, rspr, sgc, ssn, uasd and usgs.
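Assuming mag and pro are ordinary query-string parameters (which is how the description above reads), the request can be parameterized like this; the URL is left blank as in the other examples:

```python
def build_quake_params(min_magnitude=0, provider="all"):
    """Build query parameters for the 'earthquakes above a magnitude'
    endpoint.  mag/pro are the parameter names described above."""
    return {"mag": min_magnitude, "pro": provider}

# Maximum coverage: magnitude 0, all reporting organizations.
print(build_quake_params(0, "all"))
# Usage sketch: requests.get(URL, params=build_quake_params(0, "all"))
```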

Another interesting feature of the application is the live chat. Getting updates from people in the affected area seems valuable, so this endpoint is also interesting.

It returns a JSON list of chat messages. The endpoint takes two parameters, idmin and postfix. The idmin parameter specifies the minimum ID of the messages you want, and can be used to prevent downloading the same messages multiple times. The postfix parameter is like a room ID, and can be used to get messages from a specific region. _tr_gen is the general chat for Turkey.
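To avoid re-downloading the same messages, you can track the highest message ID you have seen and pass it as idmin on the next poll. A sketch of that bookkeeping, assuming each message object carries an id field (that field name is a guess):

```python
def next_idmin(messages, current_idmin=0):
    """Given one page of chat messages, return the idmin value to use
    for the next poll.  Assumes each message dict has an "id" field."""
    ids = [msg["id"] for msg in messages]
    return max(ids + [current_idmin])

page = [
    {"id": 101, "text": "felt it in Istanbul"},
    {"id": 102, "text": "all fine here"},
]
print(next_idmin(page, 0))
```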


I am planning to use these data sources to create a consolidated “status page”, like a personal dashboard that I can use to check for earthquakes. I hope others find this useful as well.

Please remember that people use these websites and apps during emergencies, so please don’t overload them. If you are going to scrape data, please be considerate and don’t make too many requests.

In fact, if you are making a user-facing application, put a cache in front of the data sources. This way, you can reduce the load on the data sources, and also provide a better user experience by reducing the latency of your application. If the websites I initially used had a cache, I would have been able to access them during the earthquake.
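As a sketch of what such a cache could look like, here is a simple in-memory TTL cache in Python; a real deployment would more likely put a CDN or reverse proxy in front, but the idea is the same:

```python
import time

class TTLCache:
    """Cache fetched values for ttl seconds to avoid hammering the
    upstream data sources on every user request."""

    def __init__(self, fetch, ttl=60):
        self.fetch = fetch          # function: key -> fresh value
        self.ttl = ttl
        self._store = {}            # key -> (expires_at, value)

    def get(self, key):
        now = time.monotonic()
        entry = self._store.get(key)
        if entry and entry[0] > now:
            return entry[1]         # still fresh: serve from cache
        value = self.fetch(key)     # stale or missing: refetch
        self._store[key] = (now + self.ttl, value)
        return value

# Demo with a stand-in fetch function that records each upstream call.
calls = []
cache = TTLCache(lambda key: calls.append(key) or f"data for {key}", ttl=60)
print(cache.get("afad"))   # fetches upstream
print(cache.get("afad"))   # served from cache; no second fetch
```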