
Magnetic Extrapolation Deployment

Science Testing Document

Input

  • The algorithm takes as its input a FITS file containing Bx, By and Bz magnetic field components (in that order) at the Sun's photosphere. These boundary data are derived from processed Vector Magnetograms, i.e. magnetograms which have had the Pi Ambiguity resolved and then undergone normalization to remove any anomalous data that may prevent the Magnetic Extrapolation from producing a convergent result.

    The boundary data are held in a single column within a binary table extension.

    An example of a Magnetic Field Extrapolation FITS input file is shown here: boundary_data.fits

Compilation and Execution:

Note: see http://www.mssl.ucl.ac.uk/twiki/bin/view/SDO/RunningNGSCode for instructions on compilation and execution on the UK's National Grid Service clusters.

A makefile (Makefile) is provided in the installation package to simplify building of the algorithm. Running 'make' will create an executable called relax2 (and a shared library called libMagExtrap.so for IDL support). Please refer to the README file in the package for details of how to install the software.

  • commandline execution: %
    mpiexec -n <no cpus> relax2 <calc_type> <no iterations> <boundary data filename> <init data filename> <output filename>
e.g. mpiexec -n 2 relax2 22 10000 boundary_data.fits grid.ini nlfff.fits

  • no cpus is a positive integer used to indicate the number of processors to be used in running the algorithm. This parameter (and the mpiexec -n prefix) can be omitted if the algorithm is to be run on a single cpu.

  • calc_type is a positive integer used to trigger different calculation options within the algorithm. The algorithm requires two steps in calculating the final Non-Linear Force Free Field (NLFFF) output: first it computes a Potential Field using the Bz component data from the input FITS file (storing the resultant data in a non-FITS binary file); it then performs the Optimization computation using the calculated Potential Field to derive the NLFFF.

    Both steps may be performed in sequence or in isolation depending upon the following options:

    • 22 Compute Potential Field + Optimization
    • 23 Compute Potential Field only
    • 20 Optimization only (requires Potential Field data to have been pre-computed)

      Both Potential Field and Optimization steps write debug information to the standard output as they execute.

  • no iterations defines the number of iterations to be performed during the Optimization step. NLFFF computation is a convergent process: a slowly converging dataset requires more iterations and becomes more compute-intensive. If convergence is achieved before the specified number of iterations is reached, the algorithm terminates.

  • boundary data filename is the name of the input FITS file containing the Bx, By and Bz boundary data from the Sun's photosphere. The full pathname is required if the file isn't in the same directory as the algorithm executable.

  • init data filename is the name of an Ascii file that contains the dimensions of the Bx, By and Bz boundary data. It also contains an integer term used to define the size of the non-photospheric boundary to be used in the NLFFF computation.

  • output filename is the name of the FITS format output data file containing the extrapolated NLFFF data. The file may be given any name by the user and does not have to reside in the same directory as the executable, but in that case it must be prefixed with the relative or absolute pathname.
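
For illustration, the init data file can be pictured as a handful of plain-text values: the three datacube dimensions plus the boundary-size integer described above. The layout and numbers below are hypothetical - consult the grid_ini files supplied with the test cases (e.g. LowLouCase1.grid_ini) for the actual format:

```
64 64 64    (hypothetical: nx, ny, nz dimensions of the Bx, By, Bz boundary data)
16          (hypothetical: size of the non-photospheric boundary)
```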

Expected Output

  • The algorithm produces a FITS file with a binary table extension consisting of 3 columns containing the Bx, By and Bz magnetic field data components of the computed NLFFF. The FITS file may take any name given by the user and does not need to reside in the same directory as the executable.

    An example of a Magnetic Field Extrapolation FITS output file is shown here: optimization.fits

Current level of completion

  • The algorithm has been tested using simulated single-boundary and 6-boundary datasets based on the Low-Lou and Titov-Demoulin models. These were supplied as Ascii files (and IDL .sav files converted to Ascii) and converted to FITS file format in accordance with eSDO requirements. The code has been parallelized using the Message Passing Interface (MPI) so that the algorithm may be run as a distributed process, speeding up computation times. This will become essential when processing large datasets derived from Hinode and SDO Vector Magnetograms.

To date the algorithm has been run on desktop computers at MSSL and the National Grid Service (NGS).

Future work

Run the algorithm on the NGS Phase-2 clusters, which will be able to accommodate larger datasets such as Hinode.

Remove grid.ini and references to it. Some of the information it contains is duplicated in the FITS input file, while the other parameters could be entered as input arguments to the algorithm.

Incorporate Thomas Wiegelmann's latest preprocessing algorithm as 'C' code.

Create a visualization tool for the NLFFF output.

Correlate NLFFF output with the Loop Recognition algorithm.

Investigate ways of reducing the number and size of MPI data transfers.

Science Test Cases

See MagneticExtrapolationDeploymentResults

For all test case input files, download: http://msslxx.mssl.ucl.ac.uk:8080/eSDO/MagneticExtrapolation_1.0_testdata.tar

Case 1: Low and Lou case #1

Description

Analytical force-free model developed by B.C. Low and Y.Q. Lou in 1990 and used for evaluating the 6 NLFF magnetic field extrapolation methods presented at the NLFFF workshop in 2005. Uses boundary data from all 6 boundaries of the Low-Lou solution.

Input

  • Initialisation data for Low and Lou test case #1

    MagneticExtrapolation/test/simulated/LowLouCase1.grid_ini (see Test Case download file)

  • Boundary data for Low and Lou test case #1

    MagneticExtrapolation/test/simulated/LowLouCase1.fits (see Test Case download file)

The dataset provides boundary data for all 6 boundaries of a 64³ pixel cubic volume bounded by x, y ∈ [-1, +1] and z ∈ [0, 2]. Computation is performed on a 64 * 64 grid.

eigenvalues:
nLL = 1
mLL = 1

point source location:
l = 0.3
Φ = π/4

Expected Output

Binary file containing the computed NLFFF. The result needs to be compared with Wiegelmann's solution, which has been verified against the Low and Lou model solution. It should show a more non-potential field overall than case #2.

Case 2: Low and Lou case #2

Description

Analytical force-free model developed by B.C. Low and Y.Q. Lou in 1990 and used for evaluating the 6 NLFF magnetic field extrapolation methods presented at the NLFFF workshop in 2005. Uses boundary data from the bottom boundary (the photosphere) of the Low-Lou solution.

Input

  • Initialisation data for Low and Lou test case #2

    MagneticExtrapolation/test/simulated/LowLouCase2.grid_ini (see Test Case download file)

  • Boundary data for Low and Lou test case #2

    MagneticExtrapolation/test/simulated/LowLouCase2.fits (see Test Case download file)

The dataset provides boundary data for the bottom boundary of a 64³ pixel cubic volume bounded by x, y ∈ [-3, +3]. Computation is performed on a 192 * 192 pixel grid centred on the 64³ pixel test region.

eigenvalues:
nLL = 3
mLL = 1

point source location:
l = 0.3
Φ = 4π/5

Expected Output

Binary file containing the computed NLFFF. The result needs to be compared with Wiegelmann's solution, which has been verified against the Low and Lou model solution. It should show a more non-potential field at the centre of the model than case #1, although less non-potential overall.

Case 3: Titov Demoulin case #1

Description

Analytical force-free equilibrium model developed by V.S. Titov and P. Demoulin in 1999. The model is configured to simulate a current-carrying flux rope in an active region. In this case, the exact equilibrium data for all 6 boundaries of the computational box are available.

Input

  • Initialisation data for Titov and Demoulin test case #1

    MagneticExtrapolation/test/simulated/TitovDemoulinCase1.grid_ini (see Test Case download file)

  • Boundary data for Titov and Demoulin test case #1

    MagneticExtrapolation/test/simulated/TitovDemoulinCase1.fits (see Test Case download file)

TD equilibrium parameters are:

R = 110 Mm = 2.2 (major torus radius)
a = 35 Mm = 0.7 (minor torus radius)
d = 50 Mm = 1.0 (depth of torus centre)
L = 100 Mm = 2.0 (monopole distance)
q = 100 T Mm² (magnetic charge)
I0 = -13 TA (line current)
hapex = 1.2 (apex height)
yfoot = 1.960 (footpoint position)
Bapex = 1.0 (normal apex field strength)
Φ = -1.41π (average twist)

Computation is done on a 150 * 250 * 100 3-D grid, with a boundary layer of 16 points set on the lateral and upper boundaries.

Weighting factor is set to 1.0 throughout the computational box.

Expected Output

Output file too large to upload to Twiki.

Binary file containing the computed NLFFF. The result needs to be compared with Wiegelmann's solution, which has been verified against the Titov and Demoulin equilibrium model. The result should show excellent correlation with the Titov-Demoulin equilibrium.

Case 4: Titov Demoulin case #2

Description

Analytical force-free equilibrium model developed by V.S. Titov and P. Demoulin in 1999. The model is configured to simulate a current-carrying flux rope in an active region. In this case, exact equilibrium data are available for the photospheric boundary only.

Input

  • Initialisation data for Titov and Demoulin test case #2

    MagneticExtrapolation/test/simulated/TitovDemoulinCase2.grid_ini (see Test Case download file)

  • Boundary data for Titov and Demoulin test case #2

    MagneticExtrapolation/test/simulated/TitovDemoulinCase2.fits (see Test Case download file)

TD equilibrium parameters are:

Same as case #1

Computation done on a 150 * 250 * 100 3-D grid, with a boundary layer of 0 set for the lateral and upper boundaries.

Weighting factor is set to 1.0 in the central 118 * 218 * 84 region of the computational box. This falls to zero with a cosine profile towards the lateral and top boundaries.

Expected Output

Output file too large to upload to Twiki.

Binary file containing the computed NLFFF. The result needs to be compared with Wiegelmann's solution, which has been verified against the Titov and Demoulin equilibrium model. The result reconstructs the Titov-Demoulin equilibrium only approximately. Divergence is greatest where the magnetic fields emerge from the lower boundary.

Case 5: Hinode datasets

Description

Input

Expected Output

Unit Testing

  • gcc compilation: A makefile (Makefile.test) is provided in the installation package to enable a unit test executable called MagExtrapTests to be built. When executed, a series of internal checks is run and the results are printed on the standard output.

  • commandline execution: %
    MagExtrapTests

Classes with unit tests:

  • bfield.c (21 tests)
  • dataio.c (2 tests)
  • init.c (1 test)
  • loop.c (6 tests)
  • optimization.c (2 tests)
  • relax1.c (1 test)

Running the algorithm from AstroGrid

AstroGrid workflow instructions:

  1. Open AstroGrid workbench and click "Task Launcher"
  2. Tasks: find application, specify variables and file as input, specify files as output, launch
  3. Task Launcher search: magnetic extrapolation or "Solar Magnetic Extrapolation"

Running the algorithm from IDL

The algorithm may be run from within IDL using the IDL wrapper MagExtrap.pro provided in the installation package. The input parameters are the same as for the commandline version. For example:

  • idl> .run MagExtrap.pro
  • idl> MAG_EXTRAP_WRAP, 22, 10000, 'LowLouCase2.fits', 'LowLouCase2.grid_ini', 'output.fits'

Please note: The libMagExtrap.so shared library is required by the IDL wrapper. Also, the IDL wrapper does not support running the algorithm on multiple processors.

Message Passing Interface (MPI)

Background to Parallel Processing

Computation of a magnetic Non-Linear Force Free Field (NLFFF) is a compute-intensive activity. Even the fastest NLFFF algorithms, such as Wiegelmann's 'Optimization with weighting' method, would take an estimated 8000 CPU hours to extrapolate a field from a 512³ pixel datacube.

Such intensive processing requires a parallel processing approach in order to reduce the computation to manageable levels. Wiegelmann's code, for example, makes use of OpenMP, a set of shared-memory parallel processing APIs that support C/C++ and FORTRAN routines (on a variety of OS platforms) and enable the computational load to be split automatically among the available processors. Such a system is used routinely on Cray supercomputers.

Little modification is required to make software OpenMP compatible: the user simply adds OpenMP directives to the code declaring which variables are to be shared amongst the processors - OpenMP does the rest.

Of course, not all institutions have access to a parallel processing facility such as a Cray supercomputer. Fortunately, there is an alternative: the Message Passing Interface (MPI), an API that supports parallel processing across a distributed memory system such as a cluster - an array of connected computers used for intensive computing tasks. Most research establishments have one or more computer clusters installed, and MPI is widely supported.

The advantage of MPI is that it can be run without the need for expensive, specialised hardware. The drawback is that MPI puts the onus on the user to decide how to split the computation across processors, and it may require substantial software changes.

MPICH2

Part of the eSDO-MSSL remit is to make Thomas Wiegelmann's NLFFF algorithm MPI compatible and, ultimately, available through AstroGrid and the National Grid Service (NGS), which supports MPI. The NGS is an e-Science GRID collaboration that provides researchers with on-line access to various computational and data resources.

There are numerous implementations of MPI available. A free software version called MPICH2 (which conforms to the MPI-2.0 specification) was chosen for MSSL in-house experiments with MPI. NGS supports MPICH1 but the small subset of MPICH2 methods used in our software are backwards compatible with the earlier version and should pose no problem when run on NGS.

Experiments on MSSL machines

Installation

Three MSSL servers were chosen for the MPI experiments:

  • msslxs.mssl.ucl.ac.uk
  • msslxt.mssl.ucl.ac.uk
  • msslxx.mssl.ucl.ac.uk

It is important when running MPI compatible software to ensure that each server has similar resources (memory, disk-space and CPUs of comparable processing power with similar loading). This ensures that the workload is evenly spread between CPUs. The servers listed above were chosen because they meet these criteria.

The following software was installed on each of the machines:

  • Version mpich2-1.0.3 of MPICH2 (built from source).
  • Latest Magnetic Extrapolation NLFFF code (from CVS).

Experiments with Potential Field

The NLFFF code runs in 2 distinct phases:

  1. Computes a Potential Field
  2. Computes the NLFFF using an iterative Optimization process.

The Potential Field calculation based on the LowLou Science Case #2 (photospheric boundary only) takes about 4 min 30 secs to complete on a single CPU (msslxx, the eSDO server). The calculation consists of 5 nested loops, where the size of each loop is determined by the dimensions of the Bx, By, Bz photospheric data cube used as input. For LowLou Science Case #2 this amounts to over 2 billion calculations, making it a prime candidate for parallel treatment and a good starting point for our MPI experiments.

The Potential Field calculation is split equally among multiple processors by sub-dividing the outermost loop by the number of CPUs available and summing the results of each 'slice' using one of the MPI functions designed for this purpose. In this instance each 'slice' can be computed independently, i.e. values from other CPU slices aren't needed.

The MPI modified Potential Field calculation was run using 1, 2 and then 3 CPUs and the processing times noted. The outputs of each run - B0.bin, a binary file containing the Potential Field - were compared to ensure that the same results were obtained in each case. The results are shown in Table 1.

Results

Table 1 - Potential Field calculation times (LowLou Science Case #2)

CPUs  Machines                      Time (secs)
1     msslxx                        265
2     msslxx + msslxt               134
3     msslxx + msslxt + msslxs      91

Experiments with Optimization

Deferred to NGS (see below)

Experiments using the National Grid Service (NGS)

For details see http://www.mssl.ucl.ac.uk/twiki/bin/view/SDO/MagExtrapNGS

MultiGrid-like approach

In order to reduce the running time of the Magnetic Extrapolation algorithm, Dr Wiegelmann introduced a MultiGrid-like method for running the code. This involves no change to the algorithm but requires that the NLFFF output is rebinned by a factor of 2 along each axis to form the potential field input for another run of the algorithm using a larger boundary dataset. This also requires that the original boundary dataset is rebinned to produce 2 smaller datasets.

Take the Hinode H20061212_2030 dataset, for example, originally a 320*320*256 datacube generated from a Hinode vector magnetogram. The datacube was rebinned by a factor of 0.5 along each axis to produce an intermediate dataset, and the process was repeated to produce a still smaller dataset. The 3 files (original + 2 rebinned files) were provided to MSSL in Ascii boundary data format, converted to FITS format and included in the installation package, where:

  • H20061212_2030_1.fits is an 80*80*64 B-field datacube
  • H20061212_2030_2.fits is a 160*160*128 datacube
  • H20061212_2030_3.fits is a 320*320*256 datacube

For each file there is a corresponding grid*.ini file which contains the dimensions of its respective datacube.

The execution sequence would be as follows:

  1. relax2 22 10000 H20061212_2030_1.fits grid1.ini out.fits
  2. rebin out.fits B0.bin
  3. relax2 20 10000 H20061212_2030_2.fits grid2.ini out.fits
  4. rebin out.fits B0.bin
  5. relax2 20 10000 H20061212_2030_3.fits grid3.ini out.fits

The out.fits file produced in step 5 is the final NLFFF output.

Please note that in steps 3 and 5 only the optimization calculation is run; the potential field is calculated in step 1 only. The potential field used in steps 3 and 5 is derived by rebinning the output of the previous optimization calculation.

This approach of rebinning the main dataset into smaller datasets, processing these in turn (smallest to largest), and using the rebinned NLFFF output as the potential field for the next larger dataset produces a better "guess" or starting point for the optimization process at each step, and allows a convergent NLFFF solution to be achieved more quickly for large datasets. The MultiGrid-like method is aimed at real solar datasets which have had the Pi ambiguity removed and been preprocessed, as is the case with the Hinode datasets provided.

The dynamic memory allocation required for the largest grid (step 5) is considerable (over 5 GB per CPU), so please ensure that your local machine(s) or cluster has adequate memory.

Pre-processing

Tools

NOTE: All .fits files have been moved from the attachments table to http://msslxx.mssl.ucl.ac.uk:8080/eSDO/algorithms/MagExtrap/MagExtrap.html.

-- MikeSmith - 17 Sep 2007

Topic attachments

  • Bout.sav.sav_ll1 (3073.7 K, 2006-12-15, MikeSmith) - NLFFF output using Low and Lou test case #1 (Wiegelmann)
  • Bout.sav.sav_ll2 (3073.9 K, 2006-12-15, MikeSmith) - NLFFF output using Low and Lou test case #2 (Wiegelmann)
  • allboundaries.dat_ll1 (1296.0 K, 2006-12-15, MikeSmith) - Boundary data for Low and Lou test case #1
  • allboundaries.dat_ll2 (337.5 K, 2006-12-15, MikeSmith) - Boundary data for Low and Lou test case #2
  • allboundaries.dat_td1 (4973.4 K, 2006-12-15, MikeSmith) - Boundary data for Titov and Demoulin case #1
  • allboundaries.dat_td2 (4973.4 K, 2006-12-15, MikeSmith) - Boundary data for Titov and Demoulin case #2
  • grid.ini_ll1 (0.1 K, 2006-12-15, MikeSmith) - Initialisation data for Low and Lou test case #1
  • grid.ini_ll2 (0.1 K, 2006-12-15, MikeSmith) - Initialisation data for Low and Lou case #2
  • grid.ini_td1 (0.1 K, 2006-12-15, MikeSmith) - Initialisation data for Titov and Demoulin case #1
  • grid.ini_td2 (0.1 K, 2006-12-15, MikeSmith) - Initialisation data for Titov and Demoulin case #2
Topic revision: r30 - 2008-01-15 - MikeSmith
 