San Diego Daily Weather Data

This post describes how to use the San Diego Daily Weather Data to forecast air temperature with machine learning.
Table of Contents
- Data Introduction
- Data Visualization with Machbase Neo
- Table Creation and Data Upload in Machbase Neo
- Experimental Methodology
- Experiment Code
- Experimental Results
1. Data Introduction
- DataHub Serial Number: 2024-14.
- Data Name: San Diego Daily Weather Data.
- Data Collection Methods: The weather data, including temperature, pressure, and relative humidity, was collected at a weather station equipped with sensors in San Diego, California, over a 3-year period from September 2011 to September 2014, providing 1-minute interval data across various seasons and weather conditions.
- Data Source: Link
- Raw data size and format: 122MB, CSV.
- Number of tags: 11.
| TAG | DESCRIPTION |
|---|---|
| air_pressure | air pressure measured at the timestamp (hectopascals) |
| air_temp | air temperature measured at the timestamp (degrees Fahrenheit) |
| avg_wind_direction | wind direction averaged over the minute before the timestamp (degrees, with 0 meaning coming from the North, and increasing clockwise) |
| avg_wind_speed | wind speed averaged over the minute before the timestamp (meters per second) |
| max_wind_direction | highest wind direction in the minute before the timestamp (degrees, with 0 being North and increasing clockwise) |
| max_wind_speed | highest wind speed in the minute before the timestamp (meters per second) |
| min_wind_direction | smallest wind direction in the minute before the timestamp (degrees, with 0 being North and increasing clockwise) |
| min_wind_speed | smallest wind speed in the minute before the timestamp (meters per second) |
| rain_accumulation | amount of accumulated rain measured at the timestamp (millimeters) |
| rain_duration | length of time rain has fallen as measured at the timestamp (seconds) |
| relative_humidity | relative humidity measured at the timestamp (percent) |
- Data Time Range: 2011-09-10 00:00:00 to 2014-09-10 23:59:00.
- Number of data records collected: 17,376,480.
- CSV data URL: https://data.yotahub.com/2024-14/datahub-2024-14-San-Diego-Daily-Weather.csv.gz
- Data Migration: San Diego Daily Weather Data Migration
2. Data Visualization with Machbase Neo
- Data visualization is possible through the Tag Analyzer in Machbase Neo.
- Select desired tag names and visualize them in various types of graphs.

- Below, you can access the 2024-14 DataHub in real time, select the desired tag names from the 11 available tags, visualize them, and preview the data patterns.
DataHub Viewer
3. Table Creation and Data Upload in Machbase Neo
- In the DataHub directory, use setup.wrk located in the San Diego Daily Weather Dataset folder to create tables and load data, as illustrated in the image below.

1) Table Creation
- The table is created immediately upon pressing the "Run" button in the menu.
- If the san_diego_weather table already exists, execute the first line and then the second; if it does not exist, start from the second line.
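- For reference, a minimal Python sketch of these two steps is shown below. It assumes the default machbase-neo HTTP query endpoint (port 5654) and a typical Machbase tag-table schema; the exact DDL in setup.wrk may differ.

```python
import requests

# Assumed default machbase-neo HTTP query endpoint; adjust host/port as needed.
MACHBASE_URL = "http://127.0.0.1:5654/db/query"

# First line: drop the table if it already exists (skip when it does not exist).
drop_sql = "DROP TABLE san_diego_weather"

# Second line: create the tag table (typical name/time/value tag-table schema;
# the actual statements in setup.wrk may differ).
create_sql = (
    "CREATE TAG TABLE san_diego_weather "
    "(name VARCHAR(100) PRIMARY KEY, time DATETIME BASETIME, value DOUBLE SUMMARIZED)"
)

for sql in (drop_sql, create_sql):
    resp = requests.get(MACHBASE_URL, params={"q": sql})
    print(sql, "->", resp.status_code)
```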

2) Data Upload
- Tables can be loaded in two different ways.
Method 1) Table loading using TQL in Machbase Neo (since machbase-neo v8.0.29-rc1)
- Pros
- Machbase Neo loads the data as soon as you press the Run button.
- Cons
- Slower table loading speed compared to the other method.

Method 2) Loading tables using commands
- Pros
- Fast table loading speed.
- Cons
- The table loading process is cumbersome.
- Open a command (cmd) window, change to the machbase-neo path, and enter the command.
- If you run the script below from the command shell, the data will be loaded at high speed into the san_diego_weather table.
curl http://data.yotahub.com/2024-14/datahub-2024-14-San-Diego-Daily-Weather.csv.gz | machbase-neo shell import --input - --compress gzip --header --method append --timeformat ns san_diego_weather
- If you need to specify a separate username and password (i.e., not sys/manager), add the --user and --password options as shown below.
curl http://data.yotahub.com/2024-14/datahub-2024-14-San-Diego-Daily-Weather.csv.gz | machbase-neo shell import --input - --compress gzip --header --method append --timeformat ns san_diego_weather --user USERNAME --password PASSWORD
4. Experimental Methodology
- Model Objective: Air Temperature Forecasting.
- Tag Used: air_temp.
- Model Configuration: DLinear (a minimal model sketch follows this list).
- Goal: Forecasting the air temperature for the next 5 hours using data from the past 10 hours.
- Learning Method: Supervised Learning.
- Train: Model Training.
- Test: Model Performance Evaluation Based on Air Temperature Forecasting.
- Model Optimizer: Adam.
- Model Loss Function: Mean Squared Error.
- Model Performance Metric: Mean Squared Error & R2 Score.
- Data Loading Method
- Loading the Entire Dataset.
- Loading the Batch Dataset.
- Data Preprocessing
- MinMax Scaling.
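- Below is a minimal PyTorch sketch of the DLinear configuration described above. The window sizes (600 input steps and 300 output steps, i.e., 10 and 5 hours of 1-minute data) and the moving-average kernel are illustrative assumptions and may differ from the notebook settings.

```python
import torch
import torch.nn as nn

INPUT_LEN, OUTPUT_LEN = 600, 300   # past 10 hours -> next 5 hours at 1-minute resolution

class DLinear(nn.Module):
    """Single-channel DLinear: decompose into trend + remainder, forecast each linearly."""
    def __init__(self, input_len=INPUT_LEN, output_len=OUTPUT_LEN, kernel=25):
        super().__init__()
        # Moving average over the input window extracts the trend component.
        self.avg = nn.AvgPool1d(kernel_size=kernel, stride=1,
                                padding=kernel // 2, count_include_pad=False)
        self.linear_trend = nn.Linear(input_len, output_len)
        self.linear_seasonal = nn.Linear(input_len, output_len)

    def forward(self, x):                              # x: (batch, input_len)
        trend = self.avg(x.unsqueeze(1)).squeeze(1)    # (batch, input_len)
        seasonal = x - trend
        return self.linear_trend(trend) + self.linear_seasonal(seasonal)

model = DLinear()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # Adam optimizer
loss_fn = nn.MSELoss()                                       # Mean Squared Error loss

# Shape check with random data: expects torch.Size([8, 300])
print(model(torch.randn(8, INPUT_LEN)).shape)
```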
5. Experiment Code
- Below is the code for each of the two ways to get data from the database.
- If all the data can be loaded into memory and trained at once without causing memory errors, method 1 is the fastest and simplest.
- If the data is too large and causes memory errors, the batch loading method proposed in method 2 is the most efficient.
Method 1) Loading the Entire Dataset
- The code below is implemented in a way that loads all the data needed for training from the database all at once.
- It is exactly the same as loading all CSV files (The only difference is that the data is loaded from Machbase Neo).
- Pros
- Can use the same code that was previously utilizing CSVs (Only the loading process is different).
- Cons
- Unable to train if trainable data size exceeds memory size.
- The entire code can be run through 14.San_Diego_Daily_weather_General.ipynb.
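- For reference, a minimal Python sketch of this approach is shown below. It assumes the default machbase-neo HTTP query endpoint and its CSV output; the notebook may use a different client or query, and the column casing of the CSV response may vary.

```python
import io
import requests
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Method 1: pull every air_temp record with a single query.
resp = requests.get(
    "http://127.0.0.1:5654/db/query",          # assumed default endpoint
    params={
        "q": "SELECT time, value FROM san_diego_weather "
             "WHERE name = 'air_temp' ORDER BY time",
        "format": "csv",
    },
)
df = pd.read_csv(io.StringIO(resp.text))
df.columns = [c.strip().lower() for c in df.columns]   # normalize column casing

# MinMax scaling of the temperature values, as listed in the methodology.
scaler = MinMaxScaler()
values = scaler.fit_transform(df[["value"]].to_numpy())
print(values.shape)
```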
Method 2) Loading the Batch Dataset
- Method for loading data from Machbase Neo one batch at a time.
- The code below fetches sequential time ranges, one per batch.
- Pros
- The model can be trained regardless of how large the data is.
- Cons
- It takes longer to train compared to method 1.
- The entire code can be run through 14.San_Diego_Daily_weather_New_Batch.ipynb.
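- A minimal Python sketch of the batch-wise fetch is shown below. The endpoint, the TO_DATE time-filter syntax, and the one-day window size are assumptions made for illustration; the actual batching logic is in 14.San_Diego_Daily_weather_New_Batch.ipynb.

```python
import io
import requests
import pandas as pd

URL = "http://127.0.0.1:5654/db/query"   # assumed default machbase-neo endpoint

def fetch_window(start, end):
    """Fetch air_temp records whose timestamps fall in [start, end)."""
    q = ("SELECT time, value FROM san_diego_weather "
         f"WHERE name = 'air_temp' AND time >= TO_DATE('{start}') "
         f"AND time < TO_DATE('{end}') ORDER BY time")
    resp = requests.get(URL, params={"q": q, "format": "csv"})
    return pd.read_csv(io.StringIO(resp.text))

def iter_batches(start, end, freq="1D"):
    """Walk the full time range one window at a time (one fetch per batch)."""
    edges = pd.date_range(start, end, freq=freq)
    for lo, hi in zip(edges[:-1], edges[1:]):
        yield fetch_window(lo.strftime("%Y-%m-%d %H:%M:%S"),
                           hi.strftime("%Y-%m-%d %H:%M:%S"))

# Example: iterate over the first week of the dataset, one day per batch.
for batch_df in iter_batches("2011-09-10", "2011-09-17"):
    print(len(batch_df))
```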
6. Experimental Results
Method 1) Loading the Entire Dataset Result


Method 2) Loading the Batch Dataset Result


- Loading the entire dataset resulted in an R2 score of 0.93, while loading the batch dataset resulted in a similar score of 0.92.
※ Various datasets and tutorial codes can be found in the GitHub repository below.
datahub/dataset/2024 at main · machbase/datahub