Sunday, December 24, 2017

Read grb2 files in Shell, Python, R, and Matlab

GRIB

GRIB, or GRIdded Binary, is a concise data format defined by the World Meteorological Organization to store weather data.
GRIB files are a collection of self-contained records of 2D data, and the individual records stand alone as meaningful data, with no references to other records or to an overall schema. So collections of GRIB records can be appended to each other or the records separated.
The latest standard is GRIB2.
Why GRIB2?
  • Compression: file sizes are two to three times smaller than equivalent NetCDF files.
  • Models are getting bigger faster than the bandwidth increases.
  • Packing is transparent to the user: files are self-describing.
Problems with GRIB: No way in GRIB to describe a collection of GRIB records.
  • Each record is independent, with no way to reference the GRIB writer’s intended schema.
  • No foolproof way to combine records into the multidimensional arrays from which they were derived.
Because of such problems, any Python/R/Matlab API (or Panoply) reading a grb2 file loses the metadata of the records, and the records are combined in an arbitrary way. Multiple records get blended with each other and become useless. So the most reliable way is still to use wgrib2 in the shell, or embed shell commands in C++, to extract exactly the records you need in your intended order.
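The self-contained record design is also what makes appending and splitting safe. As a rough illustration (a minimal pure-Python sketch, not a replacement for wgrib2): each GRIB2 message starts with the magic bytes "GRIB", carries its total length in octets 9–16 of the indicator section, and ends with "7777", so a file can be split into records without decoding any of them.

```python
import struct

def split_grib2_records(buf: bytes):
    """Split a buffer of concatenated GRIB2 messages into individual records.

    A GRIB2 message begins with the 4-byte magic "GRIB"; octet 8 is the
    edition number, octets 9-16 hold the total message length (big-endian
    64-bit), and the message ends with "7777". Records can therefore be
    separated or re-appended without touching their contents.
    """
    records = []
    pos = 0
    while pos < len(buf):
        if buf[pos:pos + 4] != b"GRIB":
            raise ValueError(f"no GRIB magic at offset {pos}")
        (length,) = struct.unpack_from(">Q", buf, pos + 8)
        msg = buf[pos:pos + length]
        if msg[-4:] != b"7777":
            raise ValueError("message does not end with '7777'")
        records.append(msg)
        pos += length
    return records
```

This is only the outer framing; decoding the packed data inside each record is the hard part that wgrib2 handles.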

shell: wgrib2 C library

Download MacPorts from https://www.macports.org/ and install it.
sudo port install wgrib2
port install automatically pulls in 24 dependencies, including glib2, hdf5, jasper, jpeg, netcdf, openssl, proj, and zlib.
wgrib2 gfs_4_20170830_1800_024.grb2  # print out all inv rows
wgrib2 gfs_4_20170830_1800_024.grb2 -s | grep ":PRES:surface" | wgrib2 -i gfs_4_20170830_1800_024.grb2 -text data1.txt  # output surface pressure
wgrib2 gfs_4_20170830_1800_024.grb2 -s | grep ":PRES:surface" | wgrib2 -i gfs_4_20170830_1800_024.grb2 -netcdf pres1.nc  # output surface pressure as a NetCDF file
wgrib trick: www.ftp.cpc.ncep.noaa.gov/wd51we/wgrib/tricks.wgrib

pygrib

This module is a python interface to the GRIB API C library from the European Centre for Medium-Range Weather Forecasts (ECMWF).
conda install -c conda-forge pygrib
import pygrib
# Library not loaded: @rpath/libpng16.16.dylib
# Reason: Incompatible library version: pygrib.cpython-35m-darwin.so requires version 51.0.0 or later, but libpng16.16.dylib provides version 49.0.0

pynio

PyNIO is not compatible with Python 3.
conda create --name=py2 python=2.7 anaconda
conda install -c conda-forge pynio
still incompatible
import Nio
# dlopen(/Users/jiang/anaconda3/envs/py2/lib/python2.7/site-packages/PyNIO/nio.so, 2): Library not loaded: @rpath/libgssapi_krb5.2.2.dylib
# Reason: image not found

GRIB-API

The ECMWF GRIB-API is an application program interface accessible from C, FORTRAN and Python programs, developed for encoding and decoding WMO GRIB edition 1 and edition 2 messages.
Please note that GRIB-API support is being discontinued at the end of 2018.

ecCodes

I couldn’t find a python library to call it.

R: gdal

The Geospatial Data Abstraction Library (GDAL) is a software library for reading and writing raster and vector geospatial data formats.
install.packages("sp")
install.packages("rgdal")
folder = "/Users/jiang/data/goesr/20170831CONUS"
file = "gfs_4_20170830_1800_024.grb2"
setwd(folder)
grib <- readGDAL(file)  # Large SpatialGridDataFrame, 827 Mb
# however, this is not useful because the extracted records are not in the same order as the .inv file
Extract the grb2 file to a nc file, then unpack and plot:
folder = "/Users/jiang/data/gfs"
file = "gfs_4_20170830_1800_024.grb2"
setwd(folder)
# system("wgrib2 -s gfs_4_20170830_1800_024.grb2 | grep :TMP:  | wgrib2 -i gfs_4_20170830_1800_024.grb2 -netcdf TMP.nc",intern=T) # run once or run in shell
library(ncdf4)
nc_version()  # "ncdf4_1.16_20170401"
temp <- nc_open("TMP.nc")
t2m.mean <- ncvar_get(temp,"TMP_2maboveground")
x <- ncvar_get(temp,"longitude")  #720
y <- ncvar_get(temp,"latitude")   #361
# dim(temp)  # 720,361
# c(nrow(temp),ncol(temp)) # 720, 361
library(maps)
library(fields)
day="DIA"
#png(filename="gfs_t2m.png",width=1215,height=607,bg="white")
rgb.palette <- colorRampPalette(c("snow1","snow2","snow3","seagreen","orange","firebrick"), space = "rgb")#colors
image.plot(x,y,t2m.mean,col=rgb.palette(200),main=as.expression(paste("GFS 24hr Average 2M Temperature",day,"00 UTC",sep="")),axes=T,legend.lab="o C")
require(maptools)
data(wrld_simpl)
plot(wrld_simpl, add = TRUE)

Matlab

It actually uses 2 free toolboxes:
  1. NCtoolbox
  2. M_MAP
clc
clear
mydate='20171224';
url = ['http://nomads.ncep.noaa.gov:9090/dods/wave/nww3/nww3', mydate,'/nww3',mydate,'_00z'];
nco=ncgeodataset(url);  % Instantiate the data set
sort(nco.variables);  % list variables
lon=nco{'lon'}(:);
lat=nco{'lat'}(:);
waveheight=nco{'htsgwsfc'}(1,:,:);% surface 
% significant height of wind waves
waveheight = squeeze(waveheight); % squeeze shape

% Plot the field using M_MAP.  Start with setting the map
% projection using the limits of the lat/lon data itself:
m_proj('miller','lat',[min(lat(:)) max(lat(:))],'lon',[min(lon(:)) max(lon(:))]);
% Next, plot the field using the M_MAP version of pcolor.
m_pcolor(lon,lat,waveheight);
shading flat;
% Add a coastline and axis values.
m_coast('patch',[.7 .7 .7]);
m_grid('box','fancy');
% Add a colorbar and title.
colorbar;
title('Example 1: WAVEWATCH III Significant Wave Height from NOMADS');

Others:

Monday, December 4, 2017

DVD, Core Meteorology

A series of DVD, director: Ron Meyer. Runtime: 30 minutes.
For grade 7 - College

Atmosphere

It is the atmosphere that keeps daytime and nighttime temperatures relatively constant, as found on the earth. Elsewhere in our solar system, no atmosphere means no possibility of life.
We live on the earth. More precisely, we live in the troposphere of the earth, like fish in the water.
The stratosphere is home to the jet stream. Also home to Ozone.
The atmosphere is a delicate dynamic balance. A balance absolutely critical for all life on the planet.
heat conduction: the pan doesn’t move.
Remarkably, the most important part of this story for us is that the troposphere is heated from the bottom to the top. In spite of being the furthest from the sun, the air closest to the ground is warmer than the air 5 miles closer to the sun.
The movement of heat around the atmosphere is called weather. All weather happens in the lower atmosphere.

weather

3 factors affect weather prediction:
  1. the initial state has gaps in its information
  2. model is not perfect
  3. the principle of time and size

climate

Climate is a pattern over time.
Humans are adapted to these stable climate patterns as well.

NG- Natural disasters

1993 Storm of the Century

3 different weather models predicted different storm tracks.
No matter how well we can predict this, we can’t stop it from occurring. If we had control of it, the meteorologist would make more money than professional sports players make.
I may lose a few toes, not a big deal… you are in good spirits.

2004 Indian Ocean earthquake and tsunami

Earthquakes actually help release the earth’s internal energy. One proposed solution is to trigger a small man-made earthquake to defuse a big one.
Experimenting with mother nature is a very difficult thing. We may trigger a bigger earthquake than the one we were trying to prevent.
Inducing quakes is an inexact science.
A $30 billion question: how often does a city like New Orleans get hit by a Category Five hurricane? Studies show New Orleans is living on borrowed time. A major hurricane is overdue.
Project Stormfury tried to weaken hurricanes, but failed.
The benefit of a hurricane is redistributing heat in the atmosphere.

Tornado intercept

Tim Samaras died in 2013 in Oklahoma, the first known death of a storm chaser.
Tim Samaras did not seem startled by the question from his love child, Matt Winter. “Matt,” he replied, “Kathy’s a strong woman. She understands this is my passion. And if something happened to me, she’d move on.”

Thursday, November 23, 2017

Robot ND, A4, Control, Deep Learning


I started a new job as a Data Analyst this month. My journey in the Udacity Nanodegree will pause for a while. Overall, the course material is of high quality and worth devoting more time to digest. Although I didn’t end up exactly where I initially imagined, my mind has been significantly broadened by this up-to-date content.
This blog is a track record of how I made my career change during the past 18 months. Thank you, Blogspot.
There are too many things to say about deep learning. It is a black box. An accuracy below 99% is not that useful. It relies on huge labeled data, expensive GPU computing power, and complicated algorithms. Anyway, it is still in development and a lot of fun to learn!

control

Control engineering, often referred to simply as controls, is a multidisciplinary topic with roots in engineering and applied mathematics.
The primary objective is to design systems so that the response to an input is a predictable and desirable output.
Virtually every organ and biological process in the body uses some form of control: body temperature, blood pressure, glucose, pH, and even skeletal muscle reflexes all rely on feedback control systems.
two types of control:
  1. open loop control: no attempt is made to measure whether the output actually is the desired response (e.g., a washing machine). In the early days, much of control theory was developed for the chemical and material processing industries. Open loop is better suited to highly predictable, non-safety-critical systems.
  2. closed loop control: a sensor feeds the output back to the controller.
97% of all regulatory controllers are PID.
PID is also called 3-knob controller: proportional, integral, derivative.
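The 3-knob idea can be sketched in a few lines (illustrative gains and plant, not tuned for any real system):

```python
class PID:
    """Minimal textbook PID controller: u = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt           # I knob: accumulated past error
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # P knob acts on present error, D knob damps fast changes
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Closing the loop on a simple integrator plant (x accumulates the control input) drives the measurement toward the setpoint.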

deep learning

instructors:
  • Luis Serrano
  • Kelvin Lwin @ NVIDIA
fully convolutional networks:
  1. replace dense layers with a 1x1 convolutional layer
  2. up-sampling through the use of transposed convolutional layers
  3. skip connection, use information from multiple resolution scales.
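To see the up-sampling step in isolation, here is a nearest-neighbor 2x up-sample in plain Python (a sketch only; the lab’s BilinearUpSampling2D interpolates between values instead of repeating them, and transposed convolutions additionally learn their weights):

```python
def upsample2x(grid):
    """Nearest-neighbor 2x up-sampling of a 2D grid.

    Each value is repeated twice horizontally, and each row is repeated
    twice vertically, doubling both spatial dimensions.
    """
    out = []
    for row in grid:
        wide = [v for v in row for _ in (0, 1)]  # repeat each value twice
        out.append(wide)
        out.append(list(wide))                   # repeat the widened row
    return out
```

A decoder block applies such an up-sample, concatenates a skip connection from the encoder, and then convolves the result.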
The Quiz uses tensorflow 0.12.1, as checked by tf.VERSION
Object detection models that can draw bounding boxes: YOLO, SSD.
Semantic segmentation can achieve classification at the pixel level.

segmentation lab

This ipynb contains the core code for the final project “follow me”.
pip install tensorflow==1.2.1  # 1.1.0 doesn't work
pip install socketIO-client
pip install transforms3d
pip install PyQt5
pip install pyqtgraph
git clone https://github.com/udacity/RoboND-Segmentation-Lab.git
The train folder has 4132 images and corresponding masks, each image is shape (256, 256, 3)
The validation folder has 1185 images.
Note:
  • A lot of scaffolding code in the utils folder, including assigning the images folder as X and the mask folder as y. All the difficult details have been handled by the utils code. Don’t treat it as a magic button. Come back to dig deeper.
  • The images have been reduced to 128*128 to expedite the computing.
  • 3 classes are background, hero and other people.
A simple model looks like this:
def fcn_model(inputs, num_classes):
    # Encoder block: separable convolution, stride 2 to downsample
    # (with stride 1 the 2x up-sample below would not match the input shape)
    output1 = SeparableConv2DKeras(filters=3, kernel_size=3, strides=2, padding='same', activation='relu')(inputs)
    output1 = layers.BatchNormalization()(output1)
    # 1x1 convolution layer in place of a dense layer
    output2 = layers.Conv2D(filters=3, kernel_size=1, strides=1, padding='same', activation='relu')(output1)
    output2 = layers.BatchNormalization()(output2)
    # Decoder block: 2x bilinear up-sampling back to the input resolution,
    # plus a skip connection to the input
    upsampled = BilinearUpSampling2D((2,2))(output2)
    output3 = layers.concatenate([upsampled, inputs])
    return layers.Conv2D(num_classes, 3, activation='softmax', padding='same')(output3)

project

‘README.md’ contains a lot of detailed instructions.

AWS setting

  • EC2
  • launch instance -> Community AMIs -> search Udacity robotics -> p2.xlarge,next, next, next,
  • security group. create new, source: my ip. Note: If you plan to connect a Jupyter notebook to your AWS instance you will need to add one connection rule. Specifically, you will need to add a custom TCP rule to allow port 8888. and set the source to My IP.
After launch
  • connect, pull out an instruction, refresh memory at http://www.yuchao.us/2017/03/aws-elastic-compute-cloud.html
  • my key is stored at ~/.ssh
  • in ssh -i "MyKeyPair.pem" root@ec2-34-236-144-24.compute-1.amazonaws.com, replace root with ubuntu. success.
  • on the AWS instance, run: jupyter notebook --ip='*' --port=8888 --no-browser
  • open a browser, enter {IPv4 Public IP}:8888
  • go back to terminal for token/password
  • if quit, type “exit” in terminal
  • if stop instance temporarily, click actions -> instance state -> stop

simulator control

L: Turns the legend with the control information on and off
H: Enables and disables local control of the quad
WSAD/Arrows: Moves the quad around when local control is on
E/Q: Rotate the quad clockwise and counterclockwise respectively
Space/C: Increase and decrease the thrust of the quad when local control is on
Scroll wheel: Zooms the camera in and out
Right mouse button (drag): Rotates the camera around the quad
Right mouse button (click): Resets the camera
Middle mouse button: Toggle patrol and follow mode
G: Reset the quad orientation
F5: Cycle quality settings
/: Path display on/off
ESC: Exit to the main menu
Ctrl-Q: Quit

execute the code

ros
cd RoboND-DeepLearning-Project/code
source activate RoboND
python preprocess_ims.py  # image preprocessing
python follower.py my_amazing_model.h5 # test in simulator

What to include in your submission

Achieve an accuracy of 40% (0.40) using the Intersection over Union IoU metric which is final_grade_score at the bottom of your notebook.
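For intuition, per-class IoU is just the count of pixels labeled with that class in both prediction and ground truth, divided by the count labeled in either. A minimal sketch over flat label lists (the notebook computes the real final_grade_score for you; this helper name is made up):

```python
def iou(pred, truth, cls):
    """Intersection over Union for one class, over flat lists of pixel labels."""
    inter = sum(1 for p, t in zip(pred, truth) if p == cls and t == cls)
    union = sum(1 for p, t in zip(pred, truth) if p == cls or t == cls)
    # convention: empty union (class absent everywhere) scores 1.0
    return inter / union if union else 1.0
```

The project metric averages IoU over classes and images, so a score of 0.40 means the predicted masks overlap the true masks by 40% on average.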
Use the Project Rubric to review the project. You must submit your project as a zip file. The submission must include:
  1. Your model_training.ipynb notebook that you have filled out.
  2. An HTML version of your model_training.ipynb notebook.
  3. Writeup report (md or pdf file) summarizing why you made the choices you did in building the network.
  4. Your model and weights file in the .h5 file format

first submission review

The project code is basically not started and report only consists of a few lines, so the project cannot be reviewed. I think you would be better served discussing with a mentor on Live Chat rather than making a submission here.
As for how to improve your model, there are two main things:
1) Need to add more layers. At the moment there is just 1 encoder, 1 1x1 convolution and 1 decoder. You ideally want 2-3 encoders and 2-3 decoders.
2) Need to add more filters to each layer. At the moment just using 3 for each layer, but should be higher than this. For example 32, 64, 128, etc.
Again I highly recommend having a discussion with a Live Chat mentor before your next submission. We can help you out, don’t worry! :udacious:

Tuesday, October 24, 2017

book, Lies my teacher told me

Note: I only finished reading the first 2 chapters. There are so many historical details that my brain is not ready to store. The key idea is that the winner tells the story in his favor. Everyone living should be aware that history is not only about a glorious heroic past, but also about struggle and brutal evolution. I read the 1st edition. The 2nd edition includes a few more textbooks and the latest research.
lies my teacher told me, 1996, 2007
By James W. Loewen, whose Ph.D. in sociology from Harvard University is based on his research on Chinese Americans in Mississippi.
The book reflects Loewen’s belief that history should not be taught as straightforward facts and dates to memorize, but rather as analysis of the context and root causes of events. Loewen recommends that teachers use two or more textbooks, so that students may realize the contradictions and ask questions, such as, “Why do the authors present the material like this?”
Because textbooks employ such a godlike tone, it never occurs to most students to question them. “In retrospect, I ask myself, why didn’t I think to ask, e.g. who were the original inhabitants of the Americas, what was their life like, and how did it change when Columbus arrived. However, back then everything was presented as if it were the full picture, so I never thought to doubt that it was.”
Sales figures are trade secrets.

1 handicapped by history: the process of Hero-making

Charles V. Willie
By idolizing those whom we honor, we do a disservice both to them and to ourselves… we fail to recognize that we could go and do likewise.
The hidden history: Helen Keller advocated socialism, and President Woodrow Wilson invaded Latin America.
Keller learned how the social class system controls people’s opportunities in life, sometimes determining even whether they can see.
I had once believed that we were all masters of our fate— that we could mold our lives into any form we pleased… But as I went more and more about the country I learned that I had spoken with assurance on a subject I knew little about. I forgot that I owed my success partly to the advantages of my birth and environment…
There are 3 great taboos in the textbook publishing: sex, religion, and social class. The notion that opportunity might be unequal in America is disliked by many textbook authors and teachers. Educators would much rather present Keller as a boring source of encouragement and inspiration to our young — if she can do it, you can do it!
A host of other reasons may help explain why textbooks omit troublesome facts:
  • pressure from the ruling class
  • pressure from textbook adoption committees
  • the wish to avoid ambiguities
  • a desire to shield children from harm or conflict
  • the perceived need to control children and avoid classroom disharmony
  • pressure to provide answers
We don’t want complicated icons. We seem to feel that a person like Helen Keller can be an inspiration only as long as she remains uncontroversial, one-dimensional.
Conclusions are not always pleasant. Most of us automatically shy away from conflict. We particularly seek to avoid conflict in the classroom.

1493

textbooks don’t tell:
  • advances in military technology.
  • social technology: bureaucracy, double-entry bookkeeping, mechanical printing
  • ideological: collect wealth and dominate other people is positively valued as the key means of winning esteem. Pursuit of wealth as a motive for coming to American. Authors believe that to have America explored and colonized for economic gain is somehow undignified.
  • readiness to embrace a new continent is the particular nature of European Christianity. evangelization
  • Europe’s recent success in taking over and exploiting various island societies.
Deep down, our culture encourages us to imagine that we are richer and more powerful because we’re smarter. We are smarter so “it’s natural” for one group to dominate another.
Most important, his purpose from the beginning was not mere exploration or even trade, but conquest and exploitation, for which he used religion as a rationale.
When Columbus was selling Queen Isabella on the wonders of the Americas, the Indians were well built and of quick intelligence. They have very good customs, and the king maintains a very marvelous state, of a style so orderly that it is a pleasure to see it, and they have good memories and they wish to see everything and ask what it is and for what it is used. Later, when Columbus was justifying his wars and his enslavement of the Indians, they became cruel and stupid, a people warlike and numerous, whose customs and religions are very different from us.
It is always useful to think badly about people one has exploited or plans to exploit. Modifying one’s opinions to bring them into line with one’s actions or planned actions is the most common outcome of the process known as cognitive dissonance. No one likes to think of himself or herself as a bad person. We cannot erase what we have done, and to alter our future behavior may not be in our interest. To change our attitude is easier.

the truth about the 1st Thanksgiving

Humans evolved in tropical regions. People moved to cooler climates only with the aid of cultural inventions: clothing, shelter, and fire.
William McNeill reckons the 1492 populations: the Americas (100M), Europe (70M). It was plague that helped European settlers come to dominate in population over the centuries.
In 1970, Wamsutta Frank James went to Plymouth and declared Thanksgiving day a National Day of Mourning for Native Americans.
The true history of Thanksgiving reveals embarrassing facts. The Pilgrims did not introduce the tradition; Eastern Indians had observed annual harvest celebrations for centuries. Our modern celebrations date back only to 1863, during the Civil War, when the Union needed all the patriotism it could muster.
The antidote to feel-good history is not feel-bad history, but honest and inclusive history. If textbook authors feel compelled to give moral instruction, they could accomplish this aim by allowing students to learn both the good and the bad sides of the Pilgrim tale. The conflict would then become part of the story, and students might discover that the knowledge they gain has implications for their lives today.