Sunday, 1 December 2019

Secure communications for the SupermarketPlanner


The SupermarketPlanner is running a Raspberry Pi REST server. This allows me to create a list on the Windows Client and sync it over the Internet to either another Windows Client or an Android Phone App.
As I can potentially connect these over the Internet, I wanted to set up a secure connection between my clients and server. The obvious solution was SSL.

With SSL, a private and public key pair is created. The public key is distributed in the form of a certificate and can be used to encrypt a message that can only be decrypted with the private key.
When a client connects to a server this information is used to set up a secure connection via a handshake mechanism. A good explanation can be found here.

For my purposes, since I am running both client and server, I decided on setting up a self-signed certificate. I used OpenSSL on my Raspberry Pi to create this, following the steps below:

Create a Private key for the Certificate Authority
openssl genrsa -des3 -out myCA.key 2048
Create the Certificate Authority cert
openssl req -x509 -new -nodes -key myCA.key -sha256 -days 3650 -out myCA.pem
Create a Private key for the Certificate
openssl genrsa -out jsco-cert.key 2048
Create certificate signing request
openssl req -new -key jsco-cert.key -out jsco.csr
Create new certificate, sign against Certificate Authority
openssl x509 -req -in jsco.csr -CA myCA.pem -CAkey myCA.key -CAcreateserial -out jsco-cert.crt -days 365 -sha256 -extfile config.txt
The DNS name for the server needs to be added to the Subject Alternative Name (SAN), not the CommonName (CN). The CN can be defined in the interactive request but the SAN can only be defined in a separate configuration file, for example:

authorityKeyIdentifier=keyid,issuer
basicConstraints=CA:FALSE
keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment
subjectAltName = @alt_names
[alt_names]
DNS.1 = jsco.hopto.org
Self-signed certificates are not signed by a well-known Certificate Authority, so browsers will pop up a warning. To overcome this on my machines I added my own CA certificate to the trusted authorities in Chrome and Edge/IE11:
  • Chrome
    •  Settings -> Advanced -> Privacy and Security -> Manage Certificates -> Authorities tab -> Import CA certificate
  • Edge / IE11
    •  Open IE11
    •  Internet Options -> Content -> Certificates
    •  Select Trusted Root Certification Authorities
    •  Click Import and choose the CA certificate to upload
Enabling SSL on the web server is simple. Using web.py I just needed to add the following (this is for version 0.40):

from cheroot.server import HTTPServer
from cheroot.ssl.builtin import BuiltinSSLAdapter
HTTPServer.ssl_adapter = BuiltinSSLAdapter(
    certificate='./Certificates/jsco-cert.crt',
    private_key='./Certificates/jsco-cert.key')
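On the client side, the same CA can be trusted from a quick Python script too. This is just a sketch: the myCA.pem path refers to the CA created in the steps above, and the URL passed to fetch is a placeholder.

```python
import ssl
import urllib.request

def make_client_context(ca_file=None):
    """Build a client SSL context; pass the path to myCA.pem so that
    only certificates signed by our own CA are trusted."""
    # With ca_file=None this falls back to the system trust store.
    # create_default_context enables certificate and hostname checking.
    return ssl.create_default_context(cafile=ca_file)

def fetch(url, ca_file):
    """GET a URL over HTTPS, verifying the server against ca_file."""
    ctx = make_client_context(ca_file)
    with urllib.request.urlopen(url, context=ctx) as resp:
        return resp.read().decode("utf-8")
```

The important detail is that the server's hostname must match the SAN in the certificate, which is why the SAN entry below matters.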
As for my client applications: after adding my CA to IE11, SupermarketPlanner, being a .NET Core app running on Windows, worked with no changes other than using the https URL. This worked on both Windows 10 and Windows 8.1 boxes.
using (var client = new HttpClient())
{
    var postData = new KeyValuePair<string, string>[]
    {
        new KeyValuePair<string, string>("A Key", "Some Payload")
    };

    var content = new FormUrlEncodedContent(postData);

    // _serverUrl can be https://x.y.z:port
    var response = await client.PostAsync(_serverUrl, content);
    // Do something with response
}
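The same form-encoded POST can be mimicked from Python when poking at the server by hand. A sketch; the URL and field names here are placeholders, not the app's real endpoint:

```python
import urllib.parse
import urllib.request

def build_post(url, fields):
    """Build a form-encoded POST request, analogous to
    FormUrlEncodedContent in the C# snippet above."""
    data = urllib.parse.urlencode(fields).encode("utf-8")
    # A Request with a data payload defaults to the POST method.
    return urllib.request.Request(url, data=data)

req = build_post("https://example.org:8080/", {"A Key": "Some Payload"})
```

Pass the request to urllib.request.urlopen (with an SSL context trusting the self-signed CA) to actually send it.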

For my Android App this was a bit more complicated. Simply changing the URL gave me this exception:

CertPathValidatorException : Trust Anchor for Certification Path not found

The documentation here: https://developer.android.com/training/articles/security-ssl was what I needed for this problem. This allows me to add my own CA to the trust chain.

I added my CA certificate to the assets/ folder in the Android project. An assets folder is a location for files that are copied as-is to the .apk file. It can then be referenced and navigated like a normal directory using the AssetManager.

protected String getData(String... params)
{
    // Removed params checks
    String output = "";
    try
    {
        if (!params[0].isEmpty())
        {
            m_restUrl += "?date=" + params[0];
        }

        URL url = new URL(m_restUrl);
        SSLContext sslContext = getSSLContext();
        HttpsURLConnection urlConnection = (HttpsURLConnection)url.openConnection();
        urlConnection.setSSLSocketFactory(sslContext.getSocketFactory());

        InputStream inputStream = new BufferedInputStream(urlConnection.getInputStream());
        try (ByteArrayOutputStream result = new ByteArrayOutputStream())
        {
            byte[] buffer = new byte[1024];
            int length;
            while ((length = inputStream.read(buffer)) != -1)
            {
                result.write(buffer, 0, length);
            }
            output = result.toString("UTF-8");
        }
        finally
        {
            urlConnection.disconnect();
        }
    }
    catch (Exception ex)
    {
        output = "ERROR: " + ex.getMessage();
    }
    return output;
}
/**
 * If we aren't using a public CA for the SSL connection we can trust the self-signed CA
 * @return SSLContext that includes self-signed CA
 */
private SSLContext getSSLContext()
{
    try
    {
        CertificateFactory cf = CertificateFactory.getInstance("X.509");

        // Load the CA. I've included it in the Assets folder
        AssetManager assetManager = m_context.getAssets();
        InputStream caInput = assetManager.open("myCA.pem");
        Certificate ca;
        try
        {
            ca = cf.generateCertificate(caInput);
        }
        finally
        {
            caInput.close();
        }

        // Create a KeyStore containing our trusted CA
        String keyStoreType = KeyStore.getDefaultType();
        KeyStore keyStore = KeyStore.getInstance(keyStoreType);
        keyStore.load(null, null);
        keyStore.setCertificateEntry("ca", ca);

        // Create a TrustManager that trusts the CAs in our KeyStore
        String tmfAlgorithm = TrustManagerFactory.getDefaultAlgorithm();
        TrustManagerFactory tmf = TrustManagerFactory.getInstance(tmfAlgorithm);
        tmf.init(keyStore);

        // Create an SSLContext that uses our TrustManager
        SSLContext context = SSLContext.getInstance("TLS");
        context.init(null, tmf.getTrustManagers(), null);
        return context;
    }
    catch (Exception ex)
    {
        return null;
    }
}
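The buffered read loop in getData is the standard chunked stream copy. The same pattern in Python, shown here purely for comparison (not part of the app):

```python
import io

def read_all(stream, chunk_size=1024):
    """Copy a binary stream into a UTF-8 string in fixed-size chunks,
    mirroring the while-read loop in getData above."""
    result = io.BytesIO()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:  # empty read signals end of stream
            break
        result.write(chunk)
    return result.getvalue().decode("utf-8")
```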

Thursday, 31 October 2019

Migrate Supermarket Planner to .NET Core

The SupermarketPlanner project is quite a few years old now, so given that .NET Core 3.0 has been released I thought it would be an interesting exercise to upgrade it.

My starting point for this was this Microsoft blog. (After installing .NET Core 3.0 and Visual Studio 2019.)

Portability Analyzer

The portability analyzer is a tool that reports how ready your Windows Forms or WPF application is for converting to .NET Core.

As it turns out there are two ways of getting this to run. First it can be built from this GitHub repository: https://github.com/microsoft/dotnet-apiport 

There were some points to be aware of when building this tool:
  • It needed to be built with Visual Studio 2017
  • Visual Studio Extensions support needed to be installed via the Extension manager to build a couple of the projects
  • The init.ps1 powershell script needed to be run first
The last step required Powershell to be run as admin. Also, as the script was unsigned, this command needed to be run first

    Set-ExecutionPolicy RemoteSigned

The build created .NET and .NET Core binaries. Running it generated an Excel sheet by default; the output was pretty clean, with everything supported by at least ".NET Core + Platform Extensions".

(The second way to install is via the Extension Manager in Visual Studio 2019, loading from the store. Once done, you can right-click on the solution in Solution Explorer and select "Analyze assembly portability".)

The analyzer did report problems with the unit test project. These types are no longer supported:
  • Microsoft.VisualStudio.TestTools.UnitTesting.Assert
  • Microsoft.VisualStudio.TestTools.UnitTesting.TestClassAttribute
  • Microsoft.VisualStudio.TestTools.UnitTesting.TestMethodAttribute
The simplest approach was to exclude the test project for the time being.

Change the project to SDK-Style

First step is to change from the old style .csproj format to the new SDK-style.
Following the steps in the blog, the new .csproj file looked like this. Much cleaner!

<Project Sdk="Microsoft.NET.Sdk.WindowsDesktop">
  <PropertyGroup>
    <OutputType>WinExe</OutputType>
    <TargetFramework>net461</TargetFramework>
    <UseWPF>true</UseWPF>
    <GenerateAssemblyInfo>false</GenerateAssemblyInfo>
  </PropertyGroup>
</Project>
At this point it failed to build due to a missing reference to System.Net.Http.
There was also a problem with System.Windows.Forms. Both needed to be added as references.

After a successful build it was now time to change to .NET Core 3! This is really simple, just change this line:

<TargetFramework>netcoreapp3.0</TargetFramework>

The project was successfully updated in Visual Studio and a new attempt made to build. This failed with a problem with System.Drawing.Print types. To be fair, this was in the analyzer report.

The fix was to add a reference to System.Drawing.Common v4.6 using NuGet.

There was one outstanding problem with print preview:

var printPreviewDialog1 = new System.Windows.Forms.PrintPreviewDialog();

The pragmatic approach was to remove this code. Supporting print preview will be a future task.

This cleared the build issues but unfortunately when starting the app I got a runtime error caused by missing images/settings.ico.

The reason for this is that the icons weren't automatically added as references in the new project. Manually adding them fixed the problem.

And the test project I excluded earlier? The easy way for this was to create a new unit test project and copy the tests over, referencing NUnit.Framework this time.

Azure build pipeline

I've setup an Azure build pipeline for this project but unfortunately the first build failed with this error:

##[error]ClientApp\SuperMarketPlanner.csproj(0,0):
Error : C:\Program Files\dotnet\sdk\2.2.109\Sdks\Microsoft.NET.Sdk.WindowsDesktop\Sdk not found.
Check that a recent enough .NET Core SDK is installed and/or increase the version specified in global.json.


I updated the yml file to load the 3.0.100 SDK, but then got a failure with the NuGetCommand:

Errors in D:\a\1\s\SuperMarketPannerUnitTests\SuperMarketPlannerUnitTests.csproj
    NU1102: Unable to find package Microsoft.NETCore.App with version (>= 3.0.0)
      - Found 76 version(s) in NuGetOrg [ Nearest version: 3.0.0-preview8-28405-07 ]


For now, I've got round this by explicitly defining the latest official version for the package reference in the unit test project:

<PackageReference Include="Microsoft.NETCore.App" Version="2.2.7" />

Finally, I also changed the build and test tasks in the yaml to use the DotNetCoreCLI tasks. The pipeline build is now green! The updated yaml is here:
trigger:
- master

pool:
  vmImage: 'VS2017-Win2016'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: UseDotNet@2
  displayName: 'Install .NET Core SDK'
  inputs:
    version: 3.0.x
    performMultiLevelLookup: true

- task: NuGetToolInstaller@0

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: DotNetCoreCLI@2
  inputs:
    command: 'build'

- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'
    arguments: '--configuration $(buildConfiguration)'

Friday, 4 October 2019

Docker Debug

When trying to start a new Docker container I was getting an error from Python about a missing module:

pi@raspberrypi:~/ServerSSL $ python RestServerSSL.py
Traceback (most recent call last):
  File "RestServerSSL.py", line 9, in <module>
    from web.wsgiserver import CherryPyWSGIServer
ImportError: No module named wsgiserver 

I had updated the Python code to incorporate SSL and suspected I had a different version of web.py on the image from the one I was using locally.

I therefore wanted to understand what exactly was installed on the image.

The command below allowed me to startup a container with an interactive bash shell:

docker run -it --entrypoint /bin/bash <ImageName> -s

Then I could run:

pip list 

to list out all the versions of python packages.
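The same version check can also be done from inside Python via the standard library's importlib.metadata (Python 3.8+). A small sketch; the function name is mine:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version string of a package, or None
    if the package isn't installed at all."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# e.g. comparing installed_version("web.py") on the image vs the laptop
```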

It turned out that web.py was at version 0.40 on the image, and 0.38 on my laptop. I upgraded the version on the laptop with:

pip install --upgrade web.py

Then I made the necessary changes (see here for more) and rebuilt the image.
 




Tuesday, 9 July 2019

Docker Volume Control


The Raspberry Pi web.py rest service currently saves files to the folder /home/pi/data on the Docker container. This however is only available for the lifetime of the container. In practice this doesn't matter too much because the service is very stable, but when I run updates, for example, I'd like the current list to persist.

Quick aside: Stopping a running container
(useful if I've left it running for 6 months)

First list the containers with docker ps

pi@raspberrypi:~/Server $ docker ps
CONTAINER ID        IMAGE          ...     NAMES
65cb45e19d62        rest-server    ...     loving_mirzakhani 
 
We are interested in the Names column. In this case run
docker stop loving_mirzakhani

As per the documentation, volumes are the preferred way to persist data used by containers. So to set this up for the rest server I took the following steps.

Comment out the line that populates the image folder. We don't need to copy anything now, so we can remove this line:

COPY data /home/pi/data/

(The data folder originally contained start-up data for the webservice)

Then it's simply a case of rebuilding the image and running with this command, using the --mount syntax:

docker run -d -p 8080:8080 --mount source=smktdata,target=/home/pi/data rest-server

The first time this was run it automatically created a volume called smktdata

The volume details can be shown with the command

docker volume inspect smktdata

pi@raspberrypi:~/Server $ docker volume inspect smktdata
[
    {
        "CreatedAt": "2019-07-05T18:14:50Z",
        "Driver": "local",
        "Labels": null,
        "Mountpoint": "/var/lib/docker/volumes/smktdata/_data",
        "Name": "smktdata",
        "Options": null,
        "Scope": "local"
    }
] 
  
The Mountpoint is the location of the actual data. Files can be viewed with
sudo ls /var/lib/docker/volumes/smktdata/_data
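Since docker volume inspect emits JSON, the mountpoint can also be pulled out programmatically. A sketch parsing the sample output shown above:

```python
import json

# Sample output from `docker volume inspect smktdata`, as shown above
INSPECT_OUTPUT = """[
    {
        "CreatedAt": "2019-07-05T18:14:50Z",
        "Driver": "local",
        "Labels": null,
        "Mountpoint": "/var/lib/docker/volumes/smktdata/_data",
        "Name": "smktdata",
        "Options": null,
        "Scope": "local"
    }
]"""

def mountpoint(inspect_json):
    """Return the Mountpoint of the first volume in the inspect output.
    docker inspect always returns a JSON array, hence the [0]."""
    return json.loads(inspect_json)[0]["Mountpoint"]
```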

I also have a NAS which is mounted automatically at startup. To use a folder on the NAS, I used this start-up command

docker run -d -p 8080:8080 -v /home/pi/NAS/data:/home/pi/data rest-server

Note that in this case I used the -v (--volume) syntax, which is equivalent to --mount; it essentially just combines all the options into a single field.

Tuesday, 18 June 2019

Windows Screensaver Update


My Windows "Random" photos screensaver code hasn't really been significantly touched for many years now, but as it's working really well on a daily basis on my home PCs, I thought I'd add it to GitHub.

You can use this URL to clone: https://github.com/jscott7/PhotosScreensaver.git
(A wiki has also been started: https://github.com/jscott7/PhotosScreensaver/wiki)

As part of the process I performed a bit of a clean up. It's OK, this is a very small project!

First, the settings have an option to change the delay between photo updates; unfortunately, while you could change it, the value wasn't actually being used.

The settings for the root folder for photos and the delay are saved in the registry; I'm using the key HKCU\SOFTWARE\JscoPhotoScreenSaver

public static void SaveSetting(string name, object value)
{
   var key = Registry.CurrentUser.CreateSubKey("SOFTWARE\\JscoPhotoScreenSaver");

   if (value != null)
   {
       key.SetValue(name, value);
   }
}


Then save the photos path to this key with:

SettingsUtilities.SaveSetting("photopath", filePathBox.Text);

To load back use:

object rootPath = SettingsUtilities.LoadSetting("photopath");

There was an inefficiency with the loading of photo files. I was duplicating this for each window. Not a problem for single monitors but it's unnecessary for multiple monitor setups. Fortunately this was easy to fix by moving the logic to the App.xaml.cs file.

Next, to install, the exe built by the project needs to be renamed to PhotosScreensaver.scr. You then right-click on it and select Install. I added a post-build step to the project to automatically perform this rename.

I also made a few tweaks to follow the MS Naming guidelines

Finally, I made use of string interpolation, which is new in C# 6. Previously I used a StringBuilder to generate a debug log. This was:

log.Append("Show").Append(window.Width).Append("-").Append(window.Height).Append("-").Append(window.Left).Append("-").AppendLine(window.Top.ToString()); 

And now, with string interpolation it's much cleaner:

log.Append($"Show {window.Width}-{window.Height}-{window.Left}-{window.Top}");

Friday, 7 June 2019

Raspberry Pi Dashboard with Dakboard

A while ago I bought a cheap Raspberry Pi touchscreen. This like so many impulse buys spent many months gathering dust in a cupboard but given the progress of my webserver, I thought I'd try setting it up as a display.

I was inspired by the excellent Scott Hanselman blog to set up a dashboard using Dakboard and my Raspberry Pi display.

Back when I first got the display, I'd set the screen to portrait mode, but with the stand I have I needed to reset it back to landscape.  It took a while to remember how to do this.

First, in /boot/config.txt I'd added the line display_rotate=1, which rotates the display by 90 degrees. I removed this and rebooted.

Unfortunately, the response to the touch screen was still in portrait mode.

To change this I had to edit /etc/X11/xorg.conf.d/99-calibration.conf

And change the SwapAxes option back from "1" to "0".

Section "InputClass"
        Identifier      "calibration"
        MatchProduct    "ADS7846 Touchscreen"
        Option  "Calibration"   "145 3995 290 3945"
        Option "SwapAxes"    "0"
EndSection

(I also needed to recalibrate, this is available from Preferences -> Calibrate TouchScreen on my device)

I then created a Dakboard account and set up a simple view: date, RSS feed and local weather. You are then given a private URL which can be used to display this on any device.

In order to start up the Raspberry Pi displaying this page full screen with no mouse pointer, I followed the instructions on Scott Hanselman's blog.

Edit (take a copy first):  ~/.config/lxsession/LXDE-pi/autostart

Replace contents with:

@xset s off
@xset -dpms
@xset s noblank
@chromium-browser --noerrdialogs --incognito --kiosk https://dakboard.com/app?p=private-url

(There are also good instructions here)

Unfortunately, while it does look great, Dakboard doesn't quite work for me.
The ability to use my own webservice as a data input is only supported on the paid-for Premium plan. At $5.95 per month this is too much for me, so I'm going to look into creating my own webpage. After all, isn't creating something all the fun?

As an aside: with a 750mA power supply the Pi shows the lightning bolt in the top right-hand corner, which indicates it's underpowered. Not surprising given we're also powering the screen.
 

I have an 850mA supply, which is slightly better but I would need more if I wanted to avoid the computer crashing.

Monday, 3 June 2019

Loading old Physics Data Files - Part 2

Previously, my attempt at running a FORTRAN program to load my old ZEUS data ntuples failed because CERNLIB isn't compatible with a 64-bit Linux OS:

 Test loading ntuple files
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
LOCB/LOCF: address 0x562cca6e2c80 exceeds the 32 bit address space
or is not in the data segments
This may result in program crash or incorrect results
Therefore we will stop here
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 
 
Unfortunately, the recommendation in  /usr/share/doc/libpacklib1-dev/README.64-bit to statically link didn't work for me:

Static linking is the default behavior if you use the "cernlib" script when
linking, like this:

 # Should work everywhere:
 gfortran -o myprogram myprogram.F `cernlib -G Motif pawlib` 
 
Fortunately, Linux Mint still provides a 32-bit version of the latest OS. So, at this point, rather than trying to fix it in 64-bit I thought the best approach would be to set up a virtual machine.

For the VM host, I decided to use VirtualBox. This is easy to install on Ubuntu from the Software Center.

After downloading the 32-bit Linux Mint ISO I set up a new VM with 4GB of memory and dynamically allocated storage with a 70GB virtual size.

After installing Mint, I reinstalled the CERNLIB packages from the Mint Software Manager and loaded up the ntuple in PAW as per my last post.

Now, given this was working I could try and load the file using my FORTRAN program:

      program main

      implicit none
      integer istat, NEvents, idnt
      real hmem
      common/pawc/hmem(2000000)

      print *, "Test loading ntuple files"
      call hlimit(2000000)  
     
      call hropen(80,'ntuple','mc96_1.rz','',4096,istat)
 
      if (istat.ne.0) then
          print *, "Failed to open input file"
         stop
      endif

      print *, "Loaded mc96_1.rz"
      call hrin(10, 9999999, 0) 
      call hnoent(10,NEvents)
      print *, NEvents
c      print *, idnt    

      call hrend('ntuple')
      close(80)  

      end program main

And this time, success!

 Test loading ntuple files
 Loaded mc96_1.rz
       51781

There's a few points to bear in mind:
  • common/pawc/hmem(2000000) is required to reserve locations to a common /PAWC/, for the HBOOK working space (an array)
  • call hlimit(2000000) informs HBOOK of the storage limit
  • call hropen opens the direct access rz file
  • call hrin(10, 9999999, 0) reads a histogram from the current directory of the direct access file into the current directory in memory. The 9999999 is to read the highest cycle
  • call hnoent gets the number of entries for the in-memory identifier

Monday, 20 May 2019

Loading old Physics Data Files - Part 1

It's been a while since I built the CERNLIB libraries on my laptop. Since then the old laptop has finally failed and I've installed CERNLIB from the Ubuntu repositories, so I thought it was time to try and load my old data files again.

The data are saved as binary ntuples packaged as .rz files. So, to begin I tested opening them in PAW (Physics Analysis Workstation).

To load, simply run paw, take the default workstation type and enter:
hi/file 1 mc96_2.rz

jonathan@jonathan-Inspiron-5759:~/Development/Physics$ paw
 ******************************************************
 *                                                    *
 *            W E L C O M E    to   P A W             *
 *                                                    *
 *       Version 2.14/04      12 January 2004         *
 *                                                    *
 ******************************************************
 Workstation type (?=HELP) <CR>=1 : 
 Version 1.29/04 of HIGZ started
PAW > hi/file 1 mc96_2.rz 4096
PAW > 

The arguments are:
  • '1' is the 'logical unit' of the file
  • The filename, 'mc96_2.rz'
  • Record length in words (I saved the data back in the day with 4096). It's important this is included or a segmentation error is returned:
PAW > hi/file 1 mc96_2.rz

 *** Break *** Segmentation violation
 Traceq lun = 0, level = 99 

 TRACEQ.  In-line trace-back still not available.
 Longjump 
PAW > 

The structure of the ntuple currently loaded in memory can be seen with nt/print 10
(10 is the ntuple identifier. Again, I originally saved it with this value)

PAW > nt/print 10


 ******************************************************************
 * Ntuple ID = 10     Entries = 5882      ntuple
 ******************************************************************
 * Var numb * Type * Packing *    Range     *  Block   *  Name    *
 ******************************************************************
 *      1   * R*4  *         *              * FLT      * TrigDat(15)
 *      1   * I*4  *         *              * TLT      * TLT(15)
 *      1   * R*4  *         *              * TRK      * VCT_XVC
 *      2   * R*4  *         *              * TRK      * VCT_YVC
 *      3   * R*4  *         *              * TRK      * VCT_ZVC
 *      4   * I*4  *         *              * TRK      * NVTRKC
 *      5   * I*4  *         *              * TRK      * NTRKC
 *      6   * R*4  *         *              * TRK      * CHVCC 
...
 ******************************************************************
 *  Block   *  Entries  * Unpacked * Packed *   Packing Factor    *
 ******************************************************************
 * FLT      *  5882     * 60       * 60     *       1.000         *
 * TLT      *  5882     * 60       * 60     *       1.000         *
 * TRK      *  5882     * 40       * 40     *       1.000         *
 * CAL      *  5882     * 60       * 60     *       1.000         *
 * ELEC     *  5882     * 144      * 144    *       1.000         *
 * ZUFOS1   *  5882     * 16       * 16     *       1.000         *
 * ZUFOS2   *  5882     * 4        * 4      *       1.000         *
 * ZUFOS3   *  5882     * 12       * 12     *       1.000         *
 * ZUFOS4   *  5882     * 96       * 96     *       1.000         *
 * ZUFOS5   *  5882     * 16       * 16     *       1.000         *
 * TEMP     *  5882     * 8        * 8      *       1.000         *
 * TAG1     *  5882     * 4        * 4      *       1.000         *
 * TAG2     *  5882     * 4        * 4      *       1.000         *
 * LUMI1    *  5882     * 12       * 12     *       1.000         *
 * LUMI2    *  5882     * 12       * 12     *       1.000         *
 * GEN      *  5882     * 32       * 29     *       1.103         *
 * MCTRUE   *  5882     * 88       * 88     *       1.000         *
 * BGDTUP   *  5882     * 28       * 28     *       1.000         *
 * Total    *    ---    * 696      * 693    *       1.004         *
 ******************************************************************
 * Blocks = 18           Variables = 126          Columns = 174   *
 ******************************************************************
  
And to page through the data use nt/scan 10

PAW > nt/scan 10
/NTUPLE/SCAN: Only showing first 30 of expressions
+-------+--------------+-------------+--------------+--------------+--------------+-------------+-------------+--------------+-----
| Event |   TrigDat    |   TLT       |   VCT_XVC    |   VCT_YVC    |   VCT_ZVC    |   NVTRKC    |   NTRKC     |   CHVCC      |   FC
+-------+--------------+-------------+--------------+--------------+--------------+-------------+-------------+--------------+-----
|     1 |              |             | -1.43087     |  1.24444     |  18.4673     |  2          |  3          |  3.27624     |  0. 
| *   1 |  0.          |  0          |              |              |              |             |             |              |     
| *   2 |  0.          |  0          |              |              |              |             |             |              |     
| *   3 |  0.          |  0          |              |              |              |             |             |              |     

I've got 174 columns in this particular ntuple. To restrict the selection use:
nt/scan 10 varlis=[Comma separated list of columns]

PAW > nt/scan 10 varlis=ENE44M:ZufoPz
+-------+-------------+--------------+--------------+--------------+
| Event |   ENE44M    |   TEMPLUME   |   TEMPLUMG   |   ZufoPz     |
+-------+-------------+--------------+--------------+--------------+
|     1 |  1000       | -500.        | -500.        |  2.55895     |
|     2 |  1000       | -500.        | -500.        |  11.4667     |
|     3 |  1000       | -500.        | -500.        | -1.63046     |
|     4 |  1000       | -500.        | -500.        |  1.37581     |
|     5 |  1000       | -500.        | -500.        | -2.54238     |
|     6 |  1000       | -500.        | -500.        | -1.91473     |
|     7 |  1000       | -500.        | -500.        |  1.51911     |
|     8 |  1000       | -500.        | -500.        |  0.          |
|     9 |  1000       | -500.        | -500.        |  12.3712     |
|    10 |  1000       | -500.        | -500.        | -4.4222      |
|    11 |  1000       | -500.        | -500.        |  0.0533053   |
|    12 |  1000       | -500.        | -500.        | -0.0746093   |
|    13 |  1000       | -500.        | -500.        |  33.6832     |
|    14 |  1000       | -500.        | -500.        |  0.223857    |
|    15 |  1000       | -500.        | -500.        |  8.72115     |
|    16 |  1000       | -500.        | -500.        |  10.5479     |
|    17 |  1000       | -500.        | -500.        |  2.4528      |
|    18 |  1000       | -500.        | -500.        | -1.17858     |
|    19 |  1000       | -500.        | -500.        |  1.03199     |
+-------+-------------+--------------+--------------+--------------+
 

After this, I knew I could load the data but I really wanted to run the files through my old analysis code in FORTRAN.

Rather than try to compile the old project, I created a simple FORTRAN program to try and load the ntuple:

jonathan@jonathan-Inspiron-5759:~/Development/Physics/New Stuff/Development$ more Zeus1.f
      program main

      print *, "Test loading ntuple files"
     
      call hlimit(2000000)      
      call hropen(80,'ntuple','nommc_1033.rz','',4096,istat)
 
      if (istat.ne.0) then
          print *, "Failed to open input file"
         stop
      endif

      print *, "Loaded nommc_1033.rz"
      end program main

(I'll talk about these commands in the next blog)

Compilation is easy with the official repository; we just need to link against packlib:

gfortran Zeus1.f -lpacklib

And with no problems it created the default executable a.out (that takes me back!)

So with huge anticipation I ran it and, oh...

jonathan@jonathan-Inspiron-5759:~/Development/Physics/New Stuff/Development$ ./a.out
 Test loading ntuple files
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
LOCB/LOCF: address 0x7f57832df7a0 exceeds the 32 bit address space
or is not in the data segments
This may result in program crash or incorrect results
Therefore we will stop here
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
 
The reason was to be found in the readme file in the same location as the libraries:
/usr/share/doc/libpacklib1-dev/README.64-bit

CERNLIB was never designed to run on machines where the size of a pointer
is 64 bits.  The code implicitly assumes that
 sizeof(void *) == sizeof(int) == 4
in many different places.  The biggest culprits are the ZEBRA and COMIS sets
of routines in the packlib and pawlib libraries, respectively.  This would be
difficult to fix without rewriting megabytes of FORTRAN source code.

My old 2002 laptop was of course 32-bit so this wasn't a problem before. My next blog will describe how I got this working.
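A quick way to confirm whether an environment is 64-bit, and so will trip the LOCB/LOCF check above, is to look at the native pointer size. In Python:

```python
import struct

def pointer_bits():
    """Size of a native pointer in bits for the running interpreter."""
    # "P" is the format code for a native void* pointer.
    return 8 * struct.calcsize("P")

# On a modern 64-bit OS this returns 64, which breaks CERNLIB's
# assumption that sizeof(void *) == sizeof(int) == 4
```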

Tuesday, 23 April 2019

Germinating Seeds

Here's a quick and simple way to view a seed germinating.

All it took was kitchen towel, a straight glass, two sunflower seeds and some water.

Take 4 or 5 sheets of kitchen towel, dampen them slightly and roll into a cylinder. We're going to put this into the glass.

On the side of the kitchen towel, where it would be about 3/4 of the way up the glass, put the seed. We put two in, on opposite sides.

Now, place the kitchen towel into the glass and fill the bottom with water up to a depth of about 2cm.

At this point the glass was put on a shelf and left alone apart from topping up the water once the kitchen towel fully absorbed it.

After 4 days a root appeared and headed down very quickly, over 2cm a day.
We had placed one seed point up and the other point down. This had no discernible effect as the root appeared more from the side.

Then after another 2 days, the leaf came out.

Here it is, just before I planted out into a pot


The seed itself required water, warmth and oxygen to germinate. By keeping it damp, but not submerged, the seed took in a large amount of water to allow it to soften and swell the coating as well as trigger the internal metabolism.