--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<classpath>\r
+ <classpathentry exported="true" kind="lib" path="jarhdf5-1.10.0.jar" sourcepath="C:/work/hdf5/CMake-hdf5-1.10.0-patch1/hdf5-1.10.0-patch1/java/src"/>\r
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>\r
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>\r
+ <classpathentry kind="src" path="src"/>\r
+ <classpathentry kind="output" path="bin"/>\r
+</classpath>\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>hdf.hdf5lib</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.jdt.core.javabuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.ManifestBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.SchemaBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.PluginNature</nature>\r
+ <nature>org.eclipse.jdt.core.javanature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+eclipse.preferences.version=1\r
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled\r
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8\r
+org.eclipse.jdt.core.compiler.compliance=1.8\r
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error\r
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error\r
+org.eclipse.jdt.core.compiler.source=1.8\r
--- /dev/null
+\r
+Copyright Notice and License Terms for \r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 2006-2016 by The HDF Group.\r
+\r
+NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 1998-2006 by the Board of Trustees of the University of Illinois.\r
+\r
+All rights reserved.\r
+\r
+Redistribution and use in source and binary forms, with or without \r
+modification, are permitted for any purpose (including commercial purposes) \r
+provided that the following conditions are met:\r
+\r
+1. Redistributions of source code must retain the above copyright notice, \r
+ this list of conditions, and the following disclaimer.\r
+\r
+2. Redistributions in binary form must reproduce the above copyright notice, \r
+ this list of conditions, and the following disclaimer in the documentation \r
+ and/or materials provided with the distribution.\r
+\r
+3. In addition, redistributions of modified forms of the source or binary \r
+ code must carry prominent notices stating that the original code was \r
+ changed and the date of the change.\r
+\r
+4. All publications or advertising materials mentioning features or use of \r
+ this software are asked, but not required, to acknowledge that it was \r
+ developed by The HDF Group and by the National Center for Supercomputing \r
+ Applications at the University of Illinois at Urbana-Champaign and \r
+ credit the contributors.\r
+\r
+5. Neither the name of The HDF Group, the name of the University, nor the \r
+ name of any Contributor may be used to endorse or promote products derived \r
+ from this software without specific prior written permission from \r
+ The HDF Group, the University, or the Contributor, respectively.\r
+\r
+DISCLAIMER: \r
+THIS SOFTWARE IS PROVIDED BY THE HDF GROUP AND THE CONTRIBUTORS \r
+"AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED. In no \r
+event shall The HDF Group or the Contributors be liable for any damages \r
+suffered by the users arising out of the use of this software, even if \r
+advised of the possibility of such damage. \r
+\r
+-----------------------------------------------------------------------------\r
+-----------------------------------------------------------------------------\r
+\r
+Contributors: National Center for Supercomputing Applications (NCSA) at \r
+the University of Illinois, Fortner Software, Unidata Program Center (netCDF), \r
+The Independent JPEG Group (JPEG), Jean-loup Gailly and Mark Adler (gzip), \r
+and Digital Equipment Corporation (DEC).\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the Lawrence Berkeley \r
+National Laboratory (LBNL) and the United States Department of Energy \r
+under Prime Contract No. DE-AC02-05CH11231.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the University of \r
+California, Lawrence Livermore National Laboratory (UC LLNL). \r
+The following statement applies to those portions of the product and must \r
+be retained in any redistribution of source code, binaries, documentation, \r
+and/or accompanying materials:\r
+\r
+ This work was partially produced at the University of California, \r
+ Lawrence Livermore National Laboratory (UC LLNL) under contract \r
+ no. W-7405-ENG-48 (Contract 48) between the U.S. Department of Energy \r
+ (DOE) and The Regents of the University of California (University) \r
+ for the operation of UC LLNL.\r
+\r
+ DISCLAIMER: \r
+ This work was prepared as an account of work sponsored by an agency of \r
+ the United States Government. Neither the United States Government nor \r
+ the University of California nor any of their employees, makes any \r
+ warranty, express or implied, or assumes any liability or responsibility \r
+ for the accuracy, completeness, or usefulness of any information, \r
+ apparatus, product, or process disclosed, or represents that its use \r
+ would not infringe privately- owned rights. Reference herein to any \r
+ specific commercial products, process, or service by trade name, \r
+ trademark, manufacturer, or otherwise, does not necessarily constitute \r
+ or imply its endorsement, recommendation, or favoring by the United \r
+ States Government or the University of California. The views and \r
+ opinions of authors expressed herein do not necessarily state or reflect \r
+ those of the United States Government or the University of California, \r
+ and shall not be used for advertising or product endorsement purposes.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 is available with the SZIP compression library but SZIP is not part \r
+of HDF5 and has separate copyright and license terms. See "Szip Compression \r
+in HDF Products" (www.hdfgroup.org/doc_resource/SZIP/) for further details.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Java HDF5 Interface (JHI5)
+Bundle-SymbolicName: hdf.hdf5lib
+Bundle-Version: 1.10.0.patch1
+Bundle-Vendor: Semantum Oy
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Bundle-ClassPath: .,
+ jarhdf5-1.10.0.jar
+Export-Package: hdf.hdf5lib,
+ hdf.hdf5lib.callbacks,
+ hdf.hdf5lib.exceptions,
+ hdf.hdf5lib.structs
+Bundle-NativeCode: hdf5_java.dll; processor=x86_64; osname=win32
+Require-Bundle: org.slf4j.api;bundle-version="1.7.2"
--- /dev/null
+The Java HDF5 interface library and the accompanying hdf5_java.dll JNI library\r
+are built without the separately licensed ZLIB and SZIP open source libraries.\r
+The native library (DLL) has been compiled with Visual Studio 2015 and dynamic\r
+linking.
\ No newline at end of file
--- /dev/null
+<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">\r
+<html xmlns="http://www.w3.org/1999/xhtml">\r
+<head>\r
+<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>\r
+<title>About</title>\r
+</head>\r
+<body lang="EN-US">\r
+<h2>About This Content</h2>\r
+ \r
+<p>August 29th, 2016</p>\r
+<h3>License</h3>\r
+\r
+ <h3>Third Party Content</h3>\r
+ <p>The Content includes items that have been sourced from third parties as set out below.</p>\r
+ <p><em>\r
+	<strong>Java HDF5 Interface (JHI5), version 1.10.0-patch1</strong> <br/><br/>\r
+ This library has been obtained from the HDF Group\r
+ (https://www.hdfgroup.org/HDF5/release/obtain5110.html) unmodified.\r
+ See file COPYING for more information on the license.\r
+ </em></p>\r
+\r
+</body></html>
\ No newline at end of file
--- /dev/null
+source.. = src/\r
+output.. = bin/\r
+bin.includes = META-INF/,\\r
+ .,\\r
+ jarhdf5-1.10.0.jar,\\r
+ hdf5_java.dll,\\r
+ COPYING,\\r
+ README.md,\\r
+ about.html\r
+src.includes = COPYING\r
--- /dev/null
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <artifactId>hdf.hdf5lib</artifactId>
+ <packaging>eclipse-plugin</packaging>
+ <version>1.10.0.patch1</version>
+
+ <parent>
+ <groupId>org.simantics</groupId>
+ <artifactId>org.simantics.root.bundles</artifactId>
+ <version>1.0.0-SNAPSHOT</version>
+ <relativePath>..</relativePath>
+ </parent>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.eclipse.tycho</groupId>
+ <artifactId>target-platform-configuration</artifactId>
+ <configuration>
+ <environments>
+ <environment>
+ <os>win32</os>
+ <arch>x86_64</arch>
+ </environment>
+ </environments>
+ </configuration>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
\ No newline at end of file
+++ /dev/null
-package org.simantics.acorn.internal;
-
-
-public class ClusterChange2 {
- public static final int VERSION = 2;
- public static final byte SET_IMMUTABLE_OPERATION = 1; // <byte : 0 = false>
- public static final byte UNDO_VALUE_OPERATION = 2; // <int : resource index>
- private static final int INCREMENT = 1<<10;
-// private boolean dirty = false;
-// private byte[] bytes;
-// private int byteIndex;
-// private ClusterUID clusterUID;
-// private ClusterImpl cluster;
-// ClusterChange2(ClusterUID clusterUID, ClusterImpl cluster) {
-// this.clusterUID = clusterUID;
-// this.cluster = cluster;
-// init();
-// }
-// void init() {
-//// System.err.println("clusterChange2 dirty " + cluster.clusterId);
-// dirty = false;
-// bytes = new byte[INCREMENT];
-// byteIndex = 0;
-// addInt(0); // Size of byte vector. Set by flush.
-// addInt(VERSION);
-// byteIndex = clusterUID.toByte(bytes, 8);
-// }
-// boolean isDirty() {
-// return dirty;
-// }
-// void flush(GraphSession graphSession) {
-//// System.err.println("flush2 clusterChange2 " + dirty + this);
-// if (!dirty)
-// return;
-// Bytes.writeLE(bytes, 0, byteIndex - 4);
-// byte[] ops = Arrays.copyOf(bytes, byteIndex);
-//// System.err.println("flush2 clusterChange2 " + cluster.clusterId + " " + ops.length + " bytes.");
-// graphSession.updateCluster(new UpdateClusterFunction(ops));
-// init();
-// }
-// void setImmutable(boolean immutable) {
-// dirty = true;
-// addByte(SET_IMMUTABLE_OPERATION);
-// addByte((byte)(immutable ? -1 : 0));
-// }
-// void undoValueEx(int resourceIndex) {
-// dirty = true;
-// addByte(UNDO_VALUE_OPERATION);
-// addInt(resourceIndex);
-// }
-// private final void checkSpace(int len) {
-// if (bytes.length - byteIndex > len)
-// return;
-// bytes = Arrays.copyOf(bytes, bytes.length + len + INCREMENT);
-// }
-// private final void addByte(byte value) {
-// checkSpace(1);
-// bytes[byteIndex++] = value;
-// }
-// private final void addInt(int value) {
-// checkSpace(4);
-// Bytes.writeLE(bytes, byteIndex, value);
-// byteIndex += 4;
-// }
-//// private void addLong(long value) {
-//// checkSpace(8);
-//// Bytes.writeLE(bytes, byteIndex, value);
-//// byteIndex += 8;
-//// }
-}
}
+ @Override
+ void setImmutable(boolean value) {
+ cluster.setImmutable(value, support);
+ }
+
public ClusterImpl process(ClusterImpl cluster) throws IllegalAcornStateException {
this.cluster = cluster;
process();
info.finish();
return this.cluster;
}
-
+
}
+++ /dev/null
-package org.simantics.acorn.internal;
-
-import org.simantics.acorn.cluster.ClusterImpl;
-import org.simantics.acorn.exception.IllegalAcornStateException;
-import org.simantics.acorn.lru.ClusterUpdateOperation;
-import org.simantics.db.impl.ClusterSupport;
-
-public class ClusterUpdateProcessor2 extends ClusterUpdateProcessorBase2 {
-
- final ClusterSupport support;
- final ClusterUpdateOperation info;
- private ClusterImpl cluster;
-
- public ClusterUpdateProcessor2(ClusterSupport support, byte[] operations, ClusterUpdateOperation info) {
- super(operations);
- this.support = support;
- this.info = info;
- }
-
- public void process(ClusterImpl cluster) throws IllegalAcornStateException {
- this.cluster = cluster;
- process();
- info.finish();
- }
-
- @Override
- void setImmutable(boolean value) {
- cluster.setImmutable(value, support);
- }
-
-}
import org.simantics.db.exception.DatabaseException;
import org.simantics.db.service.Bytes;
import org.simantics.db.service.ClusterUID;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import fi.vtt.simantics.procore.internal.ClusterChange2;
abstract public class ClusterUpdateProcessorBase {
-
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(ClusterUpdateProcessorBase.class);
+
public final static boolean DEBUG = false;
final protected ClusterManager manager;
final private ClusterUID uid;
final private int clusterKey;
final public int version;
-
- final Map<ClusterUID, Integer> clusterKeyCache = new HashMap<ClusterUID, Integer>();
-
+
+ final Map<ClusterUID, Integer> clusterKeyCache = new HashMap<>();
+
public int getResourceKey(ClusterUID uid, int index) throws IllegalAcornStateException {
Integer match = clusterKeyCache.get(uid);
if(match != null) return match+index;
this.manager = client;
this.bytes = operations;
		this.len = Bytes.readLE4(bytes, 0)+4; // total length = payload size (4-byte LE prefix) + the prefix itself
- version = Bytes.readLE4(bytes, 4);
+ this.version = Bytes.readLE4(bytes, 4);
long cuid1 = Bytes.readLE8(bytes, 8);
long cuid2 = Bytes.readLE8(bytes, 16);
uid = ClusterUID.make(cuid1, cuid2);
try {
create();
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("resource create failed", e);
}
-
}
-
+
private void processDelete() {
-
int ri = Bytes.readLE2(bytes, pos);
pos += 2;
-
+
if(DEBUG) System.err.println("DEBUG: Delete " + ri);
-
+
try {
delete(ri);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("resource {} value delete failed", ri, e);
}
-
}
private void processModify(int op) {
try {
modify(clusterKey + ri, offset, size, bytes, pos);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("resource value modify(clusterKey: {}, ri: {}, offset: {}, size: {}, pos: {}) failed",
+ clusterKey, ri, offset, size, pos, e);
}
pos += size;
try {
set(clusterKey+r, valueBuffer, length);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("resource value set(clusterKey: {}, r: {}, length: {}) failed",
+ clusterKey, r, length, e);
}
}
try {
set(clusterKey+r, valueBuffer, length);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("resource value setShort(clusterKey: {}, r: {}, length: {}) failed",
+ clusterKey, r, length, e);
}
}
try {
claim(clusterKey+ri, predicateKey, objectKey, puid, ouid);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("statement add(clusterKey: {}, ri: {}, predicateKey: {}, objectKey: {}, puid: {}, ouid: {}) failed",
+ clusterKey, ri, predicateKey, objectKey, puid.toString(), ouid.toString(), e);
}
} else {
try {
deny(clusterKey+ri, predicateKey, objectKey, puid, ouid);
} catch (DatabaseException e) {
- e.printStackTrace();
+ LOGGER.error("statement deny(clusterKey: {}, ri: {}, predicateKey: {}, objectKey: {}, puid: {}, ouid: {}) failed",
+ clusterKey, ri, predicateKey, objectKey, puid.toString(), ouid.toString(), e);
}
}
}
public void process() throws IllegalAcornStateException {
+		if (version == ClusterChange.VERSION) {
+			process1();
+		} else if (version == ClusterChange2.VERSION) {
+			process2();
+		} else {
+			throw new IllegalAcornStateException("unsupported clusterChange version " + version);
+		}
+	}
+
+ private void process1() throws IllegalAcornStateException {
foreignPos = 0;
}
}
-
-
+
+ private void process2() throws IllegalAcornStateException {
+
+ while(pos < len) {
+
+ int op = bytes[pos++]&0xff;
+
+ switch(op) {
+
+ case ClusterChange2.SET_IMMUTABLE_OPERATION:
+ processSetImmutable(op);
+ break;
+ case ClusterChange2.UNDO_VALUE_OPERATION:
+ processUndoValue(op);
+ break;
+ case ClusterChange2.SET_DELETED_OPERATION:
+ // TODO: implement?
+ break;
+ default:
+ throw new IllegalAcornStateException("Can not process operation " + op + " for cluster " + uid);
+
+ }
+ }
+
+ }
+
+ private void processSetImmutable(int op) {
+ int value = bytes[pos++]&0xff;
+ setImmutable(value > 0);
+ }
+
+	private void processUndoValue(int op) {
+		// Read past the 4-byte resource index; this processor does not act on undo operations.
+		Bytes.readLE4(bytes, pos);
+		pos += 4;
+	}
+
abstract void create() throws DatabaseException;
abstract void delete(int resourceIndex) throws DatabaseException;
abstract void modify(int resourceKey, long offset, int size, byte[] bytes, int pos) throws DatabaseException;
abstract void set(int resourceKey, byte[] bytes, int length) throws DatabaseException;
-
+
abstract void claim(int resourceKey, int predicateKey, int objectKey, ClusterUID puid, ClusterUID ouid) throws DatabaseException;
abstract void deny(int resourceKey, int predicateKey, int objectKey, ClusterUID puid, ClusterUID ouid) throws DatabaseException;
-
+
+ abstract void setImmutable(boolean value);
+
}
+++ /dev/null
-package org.simantics.acorn.internal;
-
-import org.simantics.acorn.exception.IllegalAcornStateException;
-import org.simantics.db.service.Bytes;
-import org.simantics.db.service.ClusterUID;
-
-public abstract class ClusterUpdateProcessorBase2 {
-
- final private byte[] bytes;
- private int pos = 0;
- final private int len;
- final private ClusterUID uid;
-
- public ClusterUpdateProcessorBase2(byte[] operations) {
- this.bytes = operations;
- this.len = Bytes.readLE4(bytes, 0) + 4; // whatta?
- int version = Bytes.readLE4(bytes, 4);
- assert(version == ClusterChange2.VERSION);
- long cuid1 = Bytes.readLE8(bytes, 8);
- long cuid2 = Bytes.readLE8(bytes, 16);
- pos = 24;
- uid = ClusterUID.make(cuid1, cuid2);
- }
-
- public ClusterUID getClusterUID() {
- return uid;
- }
-
- private void processSetImmutable(int op) {
- int value = bytes[pos++]&0xff;
- setImmutable(value > 0);
- }
-
- private void processUndoValue(int op) {
- Bytes.readLE4(bytes, pos);
- pos+=4;
- }
-
- public void process() throws IllegalAcornStateException {
-
- while(pos < len) {
-
- int op = bytes[pos++]&0xff;
-
- switch(op) {
-
- case ClusterChange2.SET_IMMUTABLE_OPERATION:
- processSetImmutable(op);
- break;
- case ClusterChange2.UNDO_VALUE_OPERATION:
- processUndoValue(op);
- break;
- default:
- throw new IllegalAcornStateException("Can not process cluster " + uid);
-
- }
- }
- }
-
- abstract void setImmutable(boolean value);
-
-}
import org.simantics.acorn.lru.ClusterChangeSet.Type;
import org.simantics.db.exception.DatabaseException;
import org.simantics.db.service.ClusterUID;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
public class UndoClusterUpdateProcessor extends ClusterUpdateProcessorBase {
-
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(UndoClusterUpdateProcessor.class);
+
public final static boolean DEBUG = false;
final private ClusterChangeSet ccs;
-
+
private int oldValuesIndex = 0;
private int statementMaskIndex = 0;
-
- final public List<Entry> entries = new ArrayList<Entry>();
+
+ final public List<Entry> entries = new ArrayList<>();
public UndoClusterUpdateProcessor(ClusterManager client, ClusterStreamChunk chunk, ClusterChangeSet ccs) throws DatabaseException {
super(client, readOperation(client, chunk, ccs));
}
private static byte[] readOperation(ClusterManager manager, ClusterStreamChunk chunk, ClusterChangeSet ccs) throws AcornAccessVerificationException, IllegalAcornStateException {
-
-// ClusterStreamChunk chunk;
-// manager.streamLRU.acquireMutex();
-// try {
-// chunk = ccs.getChunk(manager);
-// } catch (Throwable t) {
-// throw new IllegalStateException(t);
-// } finally {
-// manager.streamLRU.releaseMutex();
-// }
-//
-// chunk.acquireMutex();
-// try {
-// chunk.ve
- chunk.makeResident();
- return chunk.getOperation(ccs.chunkOffset);
-// } catch (Throwable t) {
-// throw new IllegalStateException(t);
-// } finally {
-// chunk.releaseMutex();
-// }
+ chunk.makeResident();
+ return chunk.getOperation(ccs.chunkOffset);
}
@Override
}
}
-
+
+ @Override
+ void setImmutable(boolean value) {
+ LOGGER.error("Attempted to undo `setImmutable({})` cluster operation for cluster {} which is not supported.", value, ccs.cuid);
+ }
+
}
import org.simantics.acorn.Persistable;
import org.simantics.acorn.exception.AcornAccessVerificationException;
import org.simantics.acorn.exception.IllegalAcornStateException;
-import org.simantics.acorn.internal.ClusterChange;
import org.simantics.acorn.internal.UndoClusterUpdateProcessor;
import org.simantics.compressions.CompressionCodec;
import org.simantics.compressions.Compressions;
if(op.ccs == null) throw new IllegalAcornStateException("Cluster ChangeSet " + ccsId + " was not found.");
UndoClusterUpdateProcessor proc = new UndoClusterUpdateProcessor(clusters, this, op.ccs);
- if(proc.version != ClusterChange.VERSION)
- return null;
// This cluster and CCS can still be under preparation => wait
clusters.clusterLRU.ensureUpdates(proc.getClusterUID());
import org.simantics.acorn.cluster.ClusterImpl;
import org.simantics.acorn.exception.AcornAccessVerificationException;
import org.simantics.acorn.exception.IllegalAcornStateException;
-import org.simantics.acorn.internal.ClusterChange;
-import org.simantics.acorn.internal.ClusterChange2;
import org.simantics.acorn.internal.ClusterUpdateProcessor;
-import org.simantics.acorn.internal.ClusterUpdateProcessor2;
import org.simantics.db.service.Bytes;
import org.simantics.db.service.ClusterUID;
}
public void runWithData(byte[] data) throws IllegalAcornStateException, AcornAccessVerificationException {
-
try {
- int version = Bytes.readLE4(data, 4);
- if(version == ClusterChange.VERSION) {
- ClusterUpdateProcessor processor = new ClusterUpdateProcessor(manager, manager.support, data, this);
- ClusterImpl cluster = info.getForUpdate();
- cluster = processor.process(cluster);
- manager.update(uid, cluster);
- } else if (version == ClusterChange2.VERSION) {
- ClusterUpdateProcessor2 processor = new ClusterUpdateProcessor2(manager.support, data, this);
- ClusterImpl cluster = info.getForUpdate();
- processor.process(cluster);
- manager.update(uid, cluster);
- } else {
- throw new IllegalAcornStateException("unsupported clusterChange version " + version);
- }
+ ClusterUpdateProcessor processor = new ClusterUpdateProcessor(manager, manager.support, data, this);
+ ClusterImpl cluster = info.getForUpdate();
+ cluster = processor.process(cluster);
+ manager.update(uid, cluster);
} catch (IllegalAcornStateException | AcornAccessVerificationException e) {
throw e;
} catch (Throwable t) {
+++ /dev/null
-package org.simantics.db.layer0;
-
-import java.util.Map;
-
-import org.simantics.simulator.variable.exceptions.NodeManagerException;
-
-public interface StandardEngine<Node> {
-
- Object getValue(Node node) throws NodeManagerException;
- void setValue(Node node, Object value) throws NodeManagerException;
- String getName(Node node);
- Map<String,Node> getChildren(Node node);
- Map<String,Node> getProperties(Node node);
-
-}
+++ /dev/null
-package org.simantics.db.layer0;
-
-public interface StandardNode {
-
-}
import org.simantics.db.service.Bytes;
import org.simantics.db.service.ClusterUID;
-class ClusterChange2 {
+public class ClusterChange2 {
public static final int VERSION = 2;
public static final byte SET_IMMUTABLE_OPERATION = 1; // <byte : 0 = false>
public static final byte UNDO_VALUE_OPERATION = 2; // <int : resource index>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>
+ <classpathentry kind="src" path="src"/>
+ <classpathentry kind="output" path="bin"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.simantics.logging.ui</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.pde.ManifestBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.pde.SchemaBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.pde.PluginNature</nature>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8
+org.eclipse.jdt.core.compiler.compliance=1.8
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.source=1.8
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Ui
+Bundle-SymbolicName: org.simantics.logging.ui;singleton:=true
+Bundle-Version: 1.0.0.qualifier
+Bundle-Activator: org.simantics.logging.ui.Activator
+Require-Bundle: javax.inject,
+ org.eclipse.osgi,
+ org.eclipse.jface,
+ org.eclipse.e4.ui.services,
+ org.eclipse.e4.core.di.annotations,
+ org.eclipse.core.runtime,
+ org.eclipse.ui.ide,
+ org.slf4j.api,
+ org.simantics.logging,
+ org.simantics.utils,
+ org.simantics.utils.ui
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Import-Package: javax.annotation;version="1.2.0"
+Bundle-ActivationPolicy: lazy
--- /dev/null
+source.. = src/
+output.. = bin/
+bin.includes = plugin.xml,\
+ META-INF/,\
+ .,\
+ icons/,\
+ fragment.e4xmi
--- /dev/null
+<?xml version="1.0" encoding="ASCII"?>
+<fragment:ModelFragments xmi:version="2.0" xmlns:xmi="http://www.omg.org/XMI" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:commands="http://www.eclipse.org/ui/2010/UIModel/application/commands" xmlns:fragment="http://www.eclipse.org/ui/2010/UIModel/fragment" xmlns:menu="http://www.eclipse.org/ui/2010/UIModel/application/ui/menu" xmi:id="_BxaXACerEeWxCPrV0pAZQQ">
+ <fragments xsi:type="fragment:StringModelFragment" xmi:id="_QqSikIrOEeW7h_qdP9N9fw" featurename="commands" parentElementId="xpath:/">
+ <elements xsi:type="commands:Command" xmi:id="_UCYfwIrOEeW7h_qdP9N9fw" elementId="org.simantics.logging.ui.command.saveLogFiles" commandName="Save Logs" description="Save all important application log files to a ZIP-archive"/>
+ <elements xsi:type="commands:Command" xmi:id="_34UCQB3xEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.command.selectLogLevel" commandName="Select Logging Level" description="Select current logging level">
+ <parameters xmi:id="_YFD2kB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.commandparameter.selectLoggingLevel" name="Logging Level" optional="false"/>
+ </elements>
+ </fragments>
+ <fragments xsi:type="fragment:StringModelFragment" xmi:id="_fW12kIrOEeW7h_qdP9N9fw" featurename="handlers" parentElementId="xpath:/">
+ <elements xsi:type="commands:Handler" xmi:id="_k2L0IIrOEeW7h_qdP9N9fw" elementId="org.simantics.logging.ui.handlers.saveLogFiles" contributionURI="bundleclass://org.simantics.logging.ui/org.simantics.logging.ui.handlers.SaveLogFilesHandler" command="_UCYfwIrOEeW7h_qdP9N9fw"/>
+ <elements xsi:type="commands:Handler" xmi:id="_60CMgB3xEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handlers.selectLogLevel" contributionURI="bundleclass://org.simantics.logging.ui/org.simantics.logging.ui.handlers.SelectLoggingLevelHandler" command="_34UCQB3xEeiTyNdCNKIG-w"/>
+ </fragments>
+ <fragments xsi:type="fragment:StringModelFragment" xmi:id="_pVgfIIrOEeW7h_qdP9N9fw" featurename="menuContributions" parentElementId="xpath:/">
+ <elements xsi:type="menu:MenuContribution" xmi:id="_tSwX0IrOEeW7h_qdP9N9fw" elementId="org.simantics.logging.ui.menucontribution.saveLogFiles" positionInParent="after=group.main.ext" parentId="help">
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_2LM_MIrOEeW7h_qdP9N9fw" elementId="org.simantics.logging.ui.saveLogFiles.handledmenuitem" label="Save Logs" iconURI="platform:/plugin/org.simantics.logging.ui/icons/page_white_compressed.png" tooltip="Save all important application log files to a ZIP-archive" command="_UCYfwIrOEeW7h_qdP9N9fw"/>
+ </elements>
+ <elements xsi:type="menu:MenuContribution" xmi:id="_fGQd4B3xEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.menucontribution.selectLoggingLevelMenu" positionInParent="after=group.main.ext" parentId="help">
+ <children xsi:type="menu:Menu" xmi:id="_jtqG0B3xEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.menucontribution.selectLoggingLevelSubMenu" label="Logging Level.." iconURI="platform:/plugin/org.simantics.logging.ui/icons/page_white_edit.png">
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_PVgsgB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handledmenuitem.selectLoggingLevel.trace" label="TRACE" iconURI="platform:/plugin/org.simantics.logging.ui/icons/text_align_left.png" type="Radio" command="_34UCQB3xEeiTyNdCNKIG-w">
+ <parameters xmi:id="_XQtogB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.parameter.loggingLevel.trace" name="org.simantics.logging.ui.commandparameter.selectLoggingLevel" value="TRACE"/>
+ </children>
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_5bsEEB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handledmenuitem.selectLoggingLevel.debug" label="DEBUG" iconURI="platform:/plugin/org.simantics.logging.ui/icons/bug.png" type="Radio" command="_34UCQB3xEeiTyNdCNKIG-w">
+ <parameters xmi:id="_5bsEER3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.parameter.loggingLevel.debug" name="org.simantics.logging.ui.commandparameter.selectLoggingLevel" value="DEBUG"/>
+ </children>
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_5fIZoB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handledmenuitem.selectLoggingLevel.info" label="INFO" iconURI="platform:/plugin/org.simantics.logging.ui/icons/information.png" selected="true" type="Radio" command="_34UCQB3xEeiTyNdCNKIG-w">
+ <parameters xmi:id="_5fIZoR3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.parameter.loggingLevel.info" name="org.simantics.logging.ui.commandparameter.selectLoggingLevel" value="INFO"/>
+ </children>
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_5ihE0B3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handledmenuitem.selectLoggingLevel.warn" label="WARN" iconURI="platform:/plugin/org.simantics.logging.ui/icons/warning.png" type="Radio" command="_34UCQB3xEeiTyNdCNKIG-w">
+ <parameters xmi:id="_5ihE0R3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.parameter.loggingLevel.warn" name="org.simantics.logging.ui.commandparameter.selectLoggingLevel" value="WARN"/>
+ </children>
+ <children xsi:type="menu:HandledMenuItem" xmi:id="_55I1EB3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.handledmenuitem.selectLoggingLevel.error" label="ERROR" iconURI="platform:/plugin/org.simantics.logging.ui/icons/error.png" type="Radio" command="_34UCQB3xEeiTyNdCNKIG-w">
+ <parameters xmi:id="_55I1ER3yEeiTyNdCNKIG-w" elementId="org.simantics.logging.ui.parameter.loggingLevel.error" name="org.simantics.logging.ui.commandparameter.selectLoggingLevel" value="ERROR"/>
+ </children>
+ </children>
+ </elements>
+ </fragments>
+</fragment:ModelFragments>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<?eclipse version="3.4"?>
+<plugin>
+
+ <extension
+ id="org.simantics.logging.ui.fragment"
+ point="org.eclipse.e4.workbench.model">
+ <fragment
+ apply="initial"
+ uri="fragment.e4xmi">
+ </fragment>
+ </extension>
+
+</plugin>
--- /dev/null
+package org.simantics.logging.ui;
+
+import org.osgi.framework.BundleActivator;
+import org.osgi.framework.BundleContext;
+
+public class Activator implements BundleActivator {
+
+ private static BundleContext context;
+
+ static BundleContext getContext() {
+ return context;
+ }
+
+ /*
+ * (non-Javadoc)
+ * @see org.osgi.framework.BundleActivator#start(org.osgi.framework.BundleContext)
+ */
+ public void start(BundleContext bundleContext) throws Exception {
+ Activator.context = bundleContext;
+ }
+
+ /*
+ * (non-Javadoc)
+ * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext)
+ */
+ public void stop(BundleContext bundleContext) throws Exception {
+ Activator.context = null;
+ }
+
+}
--- /dev/null
+package org.simantics.logging.ui.handlers;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.time.LocalDateTime;
+import java.time.format.DateTimeFormatter;
+import java.util.List;
+import java.util.Map;
+import java.util.Map.Entry;
+
+import javax.inject.Named;
+
+import org.eclipse.core.runtime.Platform;
+import org.eclipse.e4.core.di.annotations.Execute;
+import org.eclipse.e4.ui.services.IServiceConstants;
+import org.eclipse.swt.SWT;
+import org.eclipse.swt.widgets.FileDialog;
+import org.eclipse.swt.widgets.Shell;
+import org.simantics.logging.LogCollector;
+import org.simantics.utils.FileUtils;
+import org.simantics.utils.ui.ExceptionUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class SaveLogFilesHandler {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(SaveLogFilesHandler.class);
+
+ private static final String[] FILTER_NAMES = { "ZIP Archive (*.zip)", "All Files (*.*)" };
+ private static final String[] FILTER_EXTENSIONS = { "*.zip", "*.*" };
+ private static final String USER_HOME = System.getProperty("user.home");
+
+ @Execute
+ public void execute(@Named(IServiceConstants.ACTIVE_SHELL) Shell shell) {
+
+ FileDialog dialog = new FileDialog(shell, SWT.SAVE);
+ dialog.setFilterNames(FILTER_NAMES);
+ dialog.setFilterExtensions(FILTER_EXTENSIONS);
+ if (USER_HOME != null && Files.exists(Paths.get(USER_HOME))) {
+ dialog.setFilterPath(USER_HOME);
+ }
+ StringBuilder fileName = new StringBuilder();
+ // Platform.getProduct() can return null when no product is defined
+ String productName = Platform.getProduct() != null ? Platform.getProduct().getName() : null;
+ if (productName != null)
+ fileName.append(productName.replaceAll(" ", "_")).append("-");
+
+ fileName.append("logs-").append(currentLocalDateTimeStamp());
+ String actualFileName = fileName.toString();
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Resolved log files name {}", actualFileName);
+ dialog.setFileName(actualFileName);
+
+ String destination = dialog.open();
+ if (destination != null) {
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Destination for saving log files is {}", destination);
+
+ try {
+ Path tempDir = Files.createTempDirectory(actualFileName);
+ Map<String, List<Path>> allLogs = LogCollector.allLogs();
+ for (Entry<String, List<Path>> logEntry : allLogs.entrySet()) {
+ Path subFolder = tempDir.resolve(logEntry.getKey());
+ Files.createDirectory(subFolder);
+ for (Path p : logEntry.getValue()) {
+ try {
+ Files.copy(p, subFolder.resolve(p.getFileName()));
+ } catch (IOException e) {
+ LOGGER.error("Could not copy {}", p.toAbsolutePath(), e);
+ }
+ }
+ }
+ FileUtils.compressZip(tempDir.toAbsolutePath().toString(), destination);
+ FileUtils.delete(tempDir);
+ } catch (Throwable t) {
+ LOGGER.error("Could not save log files to ZIP", t);
+ ExceptionUtils.logAndShowError("Could not save log files to ZIP", t);
+ }
+ } else {
+ if (LOGGER.isDebugEnabled()) {
+ LOGGER.debug("No destination selected for saving logs");
+ }
+ }
+ }
+
+ private static String currentLocalDateTimeStamp() {
+ return LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HHmm"));
+ }
+
+}
--- /dev/null
+package org.simantics.logging.ui.handlers;
+
+import javax.inject.Named;
+
+import org.eclipse.e4.core.di.annotations.Execute;
+import org.simantics.logging.LogConfigurator;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class SelectLoggingLevelHandler {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(SelectLoggingLevelHandler.class);
+
+ @Execute
+ public void execute(@Named("org.simantics.logging.ui.commandparameter.selectLoggingLevel") String level) {
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Setting logging level to {}", level);
+ LogConfigurator.setLoggingLevel(level);
+ }
+
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>
+ <classpathentry kind="src" path="src"/>
+ <classpathentry kind="output" path="bin"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.simantics.logging</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.pde.ManifestBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.pde.SchemaBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.pde.ds.core.builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.pde.PluginNature</nature>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+eclipse.preferences.version=1
+org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode=enabled
+org.eclipse.jdt.core.compiler.codegen.targetPlatform=1.8
+org.eclipse.jdt.core.compiler.compliance=1.8
+org.eclipse.jdt.core.compiler.problem.assertIdentifier=error
+org.eclipse.jdt.core.compiler.problem.enumIdentifier=error
+org.eclipse.jdt.core.compiler.source=1.8
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Simantics Logging Core
+Bundle-SymbolicName: org.simantics.logging
+Bundle-Version: 1.0.0.qualifier
+Bundle-Activator: org.simantics.logging.internal.Activator
+Bundle-Vendor: Semantum Oy
+Require-Bundle: org.eclipse.core.runtime,
+ org.slf4j.api,
+ ch.qos.logback.classic,
+ ch.qos.logback.core
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Bundle-ActivationPolicy: lazy
+Service-Component: logbackLogProvider.xml,
+ dbAndMetadataLogProvider.xml
+Export-Package: org.simantics.logging
--- /dev/null
+output.. = bin/
+bin.includes = META-INF/,\
+ .,\
+ scl/,\
+ logbackLogProvider.xml,\
+ dbAndMetadataLogProvider.xml
+source.. = src/
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" name="org.simantics.logging.dbAndMetadataLogProvider">
+ <implementation class="org.simantics.logging.DBAndMetadataLogProvider"/>
+ <service>
+ <provide interface="org.simantics.logging.LogProvider"/>
+ </service>
+</scr:component>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" name="org.simantics.logging">
+ <implementation class="org.simantics.logging.LogbackLogProvider"/>
+ <service>
+ <provide interface="org.simantics.logging.LogProvider"/>
+ </service>
+</scr:component>
--- /dev/null
+import "Files"
+import "Map" as Map
+
+importJava "org.simantics.logging.LogConfigurator" where
+ setLoggingLevel :: String -> <Proc> ()
+ setLoggingLevelForLogger :: String -> String -> <Proc> ()
+
+importJava "org.simantics.logging.LogCollector" where
+ allLogs :: <Proc> Map.T String [Path]
\ No newline at end of file
--- /dev/null
+package org.simantics.logging;
+
+import java.lang.reflect.Field;
+import java.net.URL;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.eclipse.core.runtime.Platform;
+import org.osgi.framework.Bundle;
+import org.slf4j.LoggerFactory;
+
+public class DBAndMetadataLogProvider implements LogProvider {
+
+ private static final org.slf4j.Logger LOGGER = LoggerFactory.getLogger(DBAndMetadataLogProvider.class);
+
+ @Override
+ public List<Path> get() {
+ List<Path> logs = new ArrayList<>();
+ Path dbClientLog = getDBClientLogLocation();
+ if (dbClientLog != null)
+ logs.add(dbClientLog);
+ Path metadataLogLocation = getMetadataLogLocation();
+ if (metadataLogLocation != null)
+ logs.add(metadataLogLocation);
+ return logs;
+ }
+
+ private static Path getDBClientLogLocation() {
+ Bundle bundle = Platform.getBundle("org.simantics.db.common");
+ if (bundle == null)
+ return null;
+ try {
+ Class<?> forName = bundle.loadClass("org.simantics.db.common.internal.config.InternalClientConfig");
+ Field field = forName.getField("DB_CLIENT_LOG_FILE");
+ String value = (String) field.get(null);
+ return Paths.get(value);
+ } catch (ClassNotFoundException | NoSuchFieldException | SecurityException | IllegalArgumentException | IllegalAccessException e) {
+ LOGGER.error("Could not read db-client.log location", e);
+ }
+ return null;
+ }
+
+ private static Path getMetadataLogLocation() {
+ String prop = System.getProperty("osgi.instance.area", null);
+ if (prop != null) {
+ try {
+ URL url = new URL(prop);
+ if ("file".equals(url.getProtocol())) {
+ Path path = Paths.get(url.toURI());
+ return path.resolve(".metadata").resolve(".log");
+ } else {
+ LOGGER.warn("Unsupported protocol {}", url);
+ }
+ } catch (Throwable t) {
+ LOGGER.error("Could not get .metadata/.log", t);
+ }
+ }
+ return null;
+ }
+}
--- /dev/null
+package org.simantics.logging;
+
+import java.nio.file.Path;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.osgi.framework.InvalidSyntaxException;
+import org.osgi.framework.ServiceReference;
+import org.simantics.logging.internal.Activator;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public final class LogCollector {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(LogCollector.class);
+
+ public static Map<String, List<Path>> allLogs() {
+ Map<String, List<Path>> results = new HashMap<>();
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Collecting all logs from declarative services");
+
+ Collection<LogProvider> logProviders = getLogProviders();
+ for (LogProvider logProvider : logProviders) {
+ List<Path> logs = logProvider.get();
+ String key = logProvider.getClass().getSimpleName();
+ Collection<Path> existing = results.get(key);
+ if (existing != null) {
+ LOGGER.info("Duplicate log providers with name {} exist, merging logs", key);
+ logs.addAll(existing);
+ }
+ results.put(key, logs);
+ }
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Found logs from providers {}", results.keySet());
+ return results;
+ }
+
+ private static List<LogProvider> getLogProviders() {
+ ServiceReference<?>[] serviceReferences = new ServiceReference<?>[0];
+ String key = LogProvider.class.getName();
+ try {
+ serviceReferences = Activator.getContext().getAllServiceReferences(key, null);
+ } catch (InvalidSyntaxException e) {
+ LOGGER.error("Could not get service references for {}!", key, e);
+ }
+ if (serviceReferences.length == 0) {
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("No service references found for {}", key);
+ return Collections.emptyList();
+ }
+
+ List<LogProvider> logProviders = new ArrayList<>(serviceReferences.length);
+ for (ServiceReference<?> reference : serviceReferences) {
+ LogProvider logProvider = (LogProvider) Activator.getContext().getService(reference);
+ logProviders.add(logProvider);
+ }
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Found {} log providers: {}", logProviders.size(), logProviders);
+ return logProviders;
+ }
+}
--- /dev/null
+package org.simantics.logging;
+
+import java.util.List;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import ch.qos.logback.classic.Level;
+import ch.qos.logback.classic.LoggerContext;
+
+/**
+ * Class for modifying the active logging configuration
+ *
+ * @author Jani Simomaa
+ *
+ */
+public final class LogConfigurator {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(LogConfigurator.class);
+
+ private LogConfigurator() {
+ }
+
+ /**
+ * Sets logging level to represent the given argument
+ *
+ * @param level one of ERROR, WARN, INFO, DEBUG or TRACE
+ */
+ public static void setLoggingLevel(String level) {
+ if (LOGGER.isInfoEnabled())
+ LOGGER.info("Setting logger level to {}", level);
+ LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
+ Level ll = getLoggerLevel(level);
+ List<ch.qos.logback.classic.Logger> loggerList = context.getLoggerList();
+ loggerList.forEach(l -> l.setLevel(ll));
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Updated levels of {} loggers", loggerList.size());
+ }
+
+ public static void setLoggingLevelForLogger(String logger, String level) {
+ if (LOGGER.isInfoEnabled())
+ LOGGER.info("Setting logger level to {} for logger {}", level, logger);
+ LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
+ Level ll = getLoggerLevel(level);
+ ch.qos.logback.classic.Logger target = context.getLogger(logger);
+ target.setLevel(ll);
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Logger level updated for {}", target);
+ }
+
+ private static Level getLoggerLevel(String level) {
+ return Level.valueOf(level);
+ }
+}
--- /dev/null
+package org.simantics.logging;
+
+import java.nio.file.Path;
+import java.util.List;
+import java.util.function.Supplier;
+
+public interface LogProvider extends Supplier<List<Path>> {
+
+}
--- /dev/null
+package org.simantics.logging;
+
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.ArrayList;
+import java.util.Iterator;
+import java.util.List;
+import java.util.stream.Collectors;
+
+import org.slf4j.LoggerFactory;
+
+import ch.qos.logback.classic.Logger;
+import ch.qos.logback.classic.LoggerContext;
+import ch.qos.logback.classic.spi.ILoggingEvent;
+import ch.qos.logback.core.Appender;
+import ch.qos.logback.core.FileAppender;
+import ch.qos.logback.core.rolling.RollingFileAppender;
+import ch.qos.logback.core.spi.AppenderAttachable;
+
+public class LogbackLogProvider implements LogProvider {
+
+ private static final org.slf4j.Logger LOGGER = LoggerFactory.getLogger(LogbackLogProvider.class);
+
+ @Override
+ public List<Path> get() {
+ List<Path> logs = new ArrayList<>();
+ try {
+ LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
+ Logger logger = context.getLogger(Logger.ROOT_LOGGER_NAME);
+ Iterator<Appender<ILoggingEvent>> appenderIter = logger.iteratorForAppenders();
+ while (appenderIter.hasNext()) {
+ Appender<ILoggingEvent> candidate = appenderIter.next();
+ FileAppender<ILoggingEvent> appender = findFileAppender(candidate);
+ if (appender != null) {
+ String logFile = appender.getFile();
+ Path log = Paths.get(logFile).toAbsolutePath();
+ if (appender instanceof RollingFileAppender) {
+ // Collect all log files, including rolled-over files in the same directory
+ Path parent = log.getParent();
+ List<Path> newLogs;
+ try (java.util.stream.Stream<Path> paths = Files.walk(parent)) {
+ newLogs = paths.filter(Files::isRegularFile).collect(Collectors.toList());
+ }
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Found {} from {}", newLogs, appender);
+ logs.addAll(newLogs);
+ } else {
+ logs.add(log);
+ }
+ } else {
+ if (LOGGER.isDebugEnabled()) {
+ LOGGER.debug("Appender {} is not a {}", candidate.getClass().getName(), FileAppender.class.getName());
+ }
+ }
+ }
+ }
+ } catch (ClassCastException e) {
+ // Okay, we are not using logback here
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Seems like we are not using logback but {} instead", LoggerFactory.getILoggerFactory(), e);
+ } catch (Throwable t) {
+ LOGGER.error("Could not collect logs", t);
+ }
+ if (LOGGER.isDebugEnabled())
+ LOGGER.debug("Found {} log files : {}", logs.size(), logs);
+ return logs;
+ }
+
+ private static FileAppender<ILoggingEvent> findFileAppender(Appender<ILoggingEvent> appender) {
+ if (appender instanceof AppenderAttachable) {
+ // Ok, has child appender
+ Iterator<Appender<ILoggingEvent>> children = ((AppenderAttachable<ILoggingEvent>) appender).iteratorForAppenders();
+ while (children.hasNext()) {
+ FileAppender<ILoggingEvent> app = findFileAppender(children.next());
+ // TODO: returns only the first FileAppender it finds, not a collection
+ if (app != null)
+ return app;
+ }
+ return null;
+ } else if (appender instanceof FileAppender) {
+ return (FileAppender<ILoggingEvent>) appender;
+ } else {
+ return null;
+ }
+ }
+
+}
--- /dev/null
+package org.simantics.logging.internal;
+
+import org.osgi.framework.BundleActivator;
+import org.osgi.framework.BundleContext;
+
+public class Activator implements BundleActivator {
+
+ private static BundleContext context;
+
+ public static BundleContext getContext() {
+ return context;
+ }
+
+ /*
+ * (non-Javadoc)
+ * @see org.osgi.framework.BundleActivator#start(org.osgi.framework.BundleContext)
+ */
+ public void start(BundleContext bundleContext) throws Exception {
+ Activator.context = bundleContext;
+ }
+
+ /*
+ * (non-Javadoc)
+ * @see org.osgi.framework.BundleActivator#stop(org.osgi.framework.BundleContext)
+ */
+ public void stop(BundleContext bundleContext) throws Exception {
+ Activator.context = null;
+ }
+
+}
*/
public abstract class AbstractDatasource implements Datasource {
- protected ListenerList<DatasourceListener> listeners = new ListenerList<DatasourceListener>(DatasourceListener.class);
+ protected ListenerList<DatasourceListener> listeners = new ListenerList<>(DatasourceListener.class);
protected Lock readLock, writeLock;
public AbstractDatasource() {
listeners.remove(listener);
}
- protected void notifyStep() {
- for (final DatasourceListener l : listeners.getListeners()) {
+ protected void notifyStep(Datasource source) {
+ for (DatasourceListener l : listeners.getListeners()) {
if (l.getExecutor() == null) {
- l.onStep( AbstractDatasource.this );
+ l.onStep( source );
} else {
- l.getExecutor().execute(new Runnable() {
- public void run() {
- l.onStep(AbstractDatasource.this);
- }
- });
+ l.getExecutor().execute(() -> l.onStep(source));
}
}
}
+ protected void notifyStep() {
+ notifyStep(AbstractDatasource.this);
+ }
+
@Override
public Lock readLock() {
return readLock;
Object value = null;
if (handle != null) {
try {
- value = handle.getValue();
+ value = handle.getValue(source);
} catch (AccessorException e) {
if (failedIds.add(key))
logger.log(Level.SEVERE, e.toString(), e);
import java.util.Random;
import org.simantics.databoard.Datatypes;
+import org.simantics.databoard.accessor.error.AccessorException;
import org.simantics.databoard.binding.Binding;
import org.simantics.databoard.binding.NumberBinding;
import org.simantics.databoard.binding.error.BindingException;
public Object getValue() {
return PseudoSolver.this.getValue(key, b);
}
+
+ @Override
+ public Object getValue(Datasource datasource) throws AccessorException {
+ return PseudoSolver.this.getValue(key, b);
+ }
@Override
public void dispose() {
*/
Object getValue() throws AccessorException;
+ /**
+ * @return current value associated with this handle within the given Datasource
+ * @throws AccessorException if value cannot be retrieved
+ */
+ Object getValue(Datasource datasource) throws AccessorException;
+
/**
* Frees any resource related to this handle.
*/
package org.simantics.simulation.experiment;
public enum ExperimentState {
- INITIALIZING, RUNNING, STOPPED, DISPOSED
+ INITIALIZING, RUNNING, STOPPED, TO_BE_DISPOSED, DISPOSED
}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<classpath>\r
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>\r
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>\r
+ <classpathentry kind="src" path="src"/>\r
+ <classpathentry kind="output" path="bin"/>\r
+</classpath>\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>org.simantics.simulator.toolkit.db</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.jdt.core.javabuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.ManifestBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.SchemaBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.PluginNature</nature>\r
+ <nature>org.eclipse.jdt.core.javanature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Simulator Toolkit for DB
+Bundle-SymbolicName: org.simantics.simulator.toolkit.db
+Bundle-Version: 1.0.0.qualifier
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Require-Bundle: org.simantics.simulator.toolkit;bundle-version="1.0.0";visibility:=reexport,
+ org.simantics.db.layer0;bundle-version="1.1.0"
+Export-Package: org.simantics.simulator.toolkit.db
+Bundle-Vendor: Semantum Oy
--- /dev/null
+source.. = src/\r
+output.. = bin/\r
+bin.includes = META-INF/,\\r
+ .\r
--- /dev/null
+package org.simantics.simulator.toolkit.db;
+
+import org.simantics.db.ReadGraph;
+import org.simantics.db.common.request.ParametrizedPrimitiveRead;
+import org.simantics.db.exception.RuntimeDatabaseException;
+import org.simantics.db.procedure.Listener;
+
+/**
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public class ExperimentStateExternalRead extends ParametrizedPrimitiveRead<Object, Integer> {
+
+ private int value = 0;
+ private Listener<Integer> listener = null;
+
+ public ExperimentStateExternalRead(Object experiment) {
+ super(experiment);
+ }
+
+ @Override
+ public void register(ReadGraph graph, Listener<Integer> procedure) {
+ procedure.execute(value);
+ if (procedure.isDisposed())
+ return;
+ if (listener != null)
+ throw new RuntimeDatabaseException("Internal error");
+ listener = procedure;
+ }
+
+ @Override
+ public void unregistered() {
+ listener = null;
+ }
+
+ public void fire() {
+ value++;
+ if (listener != null)
+ listener.execute(value);
+ }
+
+}
\ No newline at end of file
-package org.simantics.db.layer0;
+package org.simantics.simulator.toolkit.db;
import java.util.Collection;
import java.util.concurrent.ConcurrentHashMap;
import org.simantics.db.exception.DatabaseException;
import org.simantics.db.layer0.variable.NodeSupport;
import org.simantics.db.procedure.Listener;
+import org.simantics.simulator.toolkit.StandardNodeManagerSupport;
+import org.simantics.simulator.toolkit.StandardRealm;
-abstract public class StandardSessionManager<Node, Engine extends StandardEngine<Node>> {
+public abstract class StandardSessionManager<Node, Engine extends StandardNodeManagerSupport<Node>> {
private ConcurrentHashMap<String, Listener<StandardRealm<Node,Engine>>> realmListeners = new ConcurrentHashMap<>();
- private ConcurrentHashMap<String, StandardRealm<Node,Engine>> REALMS = new ConcurrentHashMap<String, StandardRealm<Node,Engine>>();
- private ConcurrentHashMap<String, NodeSupport<Node>> SUPPORTS = new ConcurrentHashMap<String, NodeSupport<Node>>();
+ private ConcurrentHashMap<String, StandardRealm<Node,Engine>> REALMS = new ConcurrentHashMap<>();
+ private ConcurrentHashMap<String, NodeSupport<Node>> SUPPORTS = new ConcurrentHashMap<>();
// Accessing Realms should be done over ParametrizedPrimitveRead for the
// case if a realm is destroyed and new one is created with the same id than
@Override
public void register(ReadGraph graph, Listener<StandardRealm<Node, Engine>> procedure) {
-
StandardRealm<Node, Engine> realm = REALMS.get(parameter);
if (realm == null) {
try {
e.printStackTrace();
}
}
-
+
if(procedure.isDisposed()) {
procedure.execute(realm);
return;
}
-
+
Listener<StandardRealm<Node,Engine>> existing = getOrDisposeListener(parameter);
assert(existing == null);
realmListeners.put(parameter, procedure);
return realm;
}
}
-
+
protected StandardSessionManager() {
}
-
+
private Listener<StandardRealm<Node,Engine>> getOrDisposeListener(String key) {
Listener<StandardRealm<Node,Engine>> listener = realmListeners.get(key);
if(listener != null) {
}
return null;
}
-
+
private void modifyRealms(String key, StandardRealm<Node,Engine> realm) {
if(realm != null) {
REALMS.put(key, realm);
public NodeSupport<Node> getOrCreateNodeSupport(ReadGraph graph, String id) throws DatabaseException {
synchronized(SUPPORTS) {
- NodeSupport<Node> result = SUPPORTS.get(id);
- if(result == null) {
- StandardRealm<Node,Engine> realm = getOrCreateRealm(graph, id);
- result = new NodeSupport<Node>(realm.getNodeManager());
- SUPPORTS.put(id, result);
- }
- return result;
+ NodeSupport<Node> result = SUPPORTS.get(id);
+ if(result == null) {
+ StandardRealm<Node,Engine> realm = getOrCreateRealm(graph, id);
+ result = new NodeSupport<>(realm.getNodeManager());
+ SUPPORTS.put(id, result);
+ }
+ return result;
}
}
-
+
public StandardRealm<Node,Engine> getOrCreateRealm(ReadGraph graph, String id) throws DatabaseException {
synchronized(REALMS) {
return graph.syncRequest(new RealmRequest(id));
}
}
-
+
protected abstract Engine createEngine(ReadGraph graph, String id) throws DatabaseException;
protected abstract StandardRealm<Node,Engine> createRealm(Engine engine, String id);
-
+
public void removeRealm(WriteGraph graph, String id) throws DatabaseException {
modifyRealms(id, null);
// remove listeners from this realm
if (support != null)
support.dispose();
}
-
+
public Collection<String> getRealms() {
return REALMS.keySet();
}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<classpath>\r
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>\r
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>\r
+ <classpathentry kind="src" path="src"/>\r
+ <classpathentry kind="output" path="bin"/>\r
+</classpath>\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>org.simantics.simulator.toolkit</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.jdt.core.javabuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.ManifestBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.SchemaBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.PluginNature</nature>\r
+ <nature>org.eclipse.jdt.core.javanature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Local Simulator Toolkit
+Bundle-SymbolicName: org.simantics.simulator.toolkit
+Bundle-Version: 1.0.0.qualifier
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Require-Bundle: org.simantics.simulator.variable;bundle-version="1.0.0";visibility:=reexport,
+ org.simantics.simulator;bundle-version="1.0.0";visibility:=reexport,
+ gnu.trove3;bundle-version="3.0.3",
+ org.slf4j.api;bundle-version="1.7.25",
+ org.simantics.simulation.sequences;bundle-version="1.0.0",
+ org.eclipse.core.runtime,
+ org.simantics.databoard;bundle-version="0.6.6",
+ org.simantics.scl.osgi;bundle-version="1.0.4"
+Export-Package: org.simantics.simulator.toolkit
--- /dev/null
+source.. = src/\r
+output.. = bin/\r
+bin.includes = META-INF/,\\r
+ .\r
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import org.simantics.databoard.binding.Binding;
+import org.simantics.simulation.sequences.action.AbstractActionContext;
+import org.simantics.simulator.IDynamicExperimentLocal;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+public class DynamicExperimentActionContext extends AbstractActionContext {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(DynamicExperimentActionContext.class);
+
+ final private IDynamicExperimentLocal experiment;
+
+ public DynamicExperimentActionContext(IDynamicExperimentLocal experiment) {
+ this.experiment = experiment;
+ }
+
+ @Override
+ public Object get(String variableName, Binding binding) {
+ return experiment.getVariableValueById(variableName);
+ }
+
+ @Override
+ public void set(String variableName, Object value, Binding binding) {
+ experiment.setVariableValueById(variableName, value, binding);
+ }
+
+}
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import java.math.BigDecimal;
+import java.util.ArrayList;
+import java.util.concurrent.Callable;
+import java.util.concurrent.CopyOnWriteArrayList;
+import java.util.concurrent.Semaphore;
+
+import org.simantics.simulator.ExperimentState;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+abstract public class DynamicExperimentThread extends Thread {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(DynamicExperimentThread.class);
+
+ private CopyOnWriteArrayList<DynamicExperimentThreadListener> listeners = new CopyOnWriteArrayList<>();
+
+ private ExperimentState state = StandardExperimentStates.CREATED;
+
+ private long runStart = 0;
+
+ private double desiredRealtimeRatio = 1000.0;
+ private double obtainedRealtimeRatio = 1.0;
+
+ private long runTimeNs = 0;
+ private long endTimeNs = 0;
+ protected long simulationStepNs = 0;
+ protected double stepInSeconds = 1.0;
+
+ public DynamicExperimentThread() {
+ }
+
+ private void updateTimes() {
+ long time = System.nanoTime();
+ long elapsed = time-runStart;
+
+ obtainedRealtimeRatio = longToDoubleDivision(runTimeNs, elapsed);
+ }
+
+ protected double longToDoubleDivision(long l1, long l2) {
+ // Use locals instead of shared fields so the method is reentrant
+ long quotient = l1 / l2;
+ long remainder = l1 % l2;
+ return (double) quotient + (double) remainder / (double) l2;
+ }
+
+ protected ArrayList<Runnable> tasks = new ArrayList<>();
+
+ abstract public void step(double stepLengthNanoSeconds);
+
+ public boolean inState(Class<? extends ExperimentState> stateClass) {
+ return stateClass.isInstance(this.state);
+ }
+
+ public void initialize() throws Exception {
+ }
+
+ public void deinitialize() throws Exception {
+ }
+
+ long stepTime = 0;
+ long taskTime = 0;
+
+ @Override
+ public void run() {
+
+ try {
+
+ try {
+
+ initialize();
+
+ try {
+ runReally();
+ } catch (Exception e) {
+ LOGGER.error("Unhandled exception while running simulation thread", e);
+ }
+
+ } catch (Exception e) {
+ LOGGER.error("Unhandled exception while initializing simulation thread", e);
+ }
+
+ } finally {
+
+ try {
+ deinitialize();
+ } catch (Exception e) {
+ LOGGER.error("Error while deinitializing simulation thread", e);
+ }
+
+ }
+
+ }
+
+ protected boolean inActiveState() {
+ return !(
+ inState(StandardExperimentStates.Disposed.class)
+ || inState(StandardExperimentStates.Disposing.class)
+ //|| inState(StandardExperimentStates.Failure.class)
+ || inState(StandardExperimentStates.ToBeDisposed.class)
+ );
+ }
+
+ private void runReally() {
+
+ while(inActiveState()) {
+
+ if(inState(StandardExperimentStates.Running.class)) {
+
+ long stepStart = System.nanoTime();
+ step(simulationStepNs);
+ stepTime += System.nanoTime() - stepStart;
+ runTimeNs += simulationStepNs;
+ updateTimes();
+ long taskStart = System.nanoTime();
+ runTasks();
+ taskTime += System.nanoTime() - taskStart;
+
+ if (LOGGER.isTraceEnabled())
+ LOGGER.trace("step time = {} s, task time = {} s", 1e-9*stepTime, 1e-9*taskTime);
+
+ while(obtainedRealtimeRatio > desiredRealtimeRatio) {
+ int ran = runTasks();
+ if(ran == 0) {
+ long elapsed = System.nanoTime()-runStart;
+ // A MathContext is required: divide without one throws ArithmeticException
+ // when the quotient has a non-terminating decimal expansion.
+ long deltaNs = BigDecimal.valueOf(runTimeNs).divide(BigDecimal.valueOf(desiredRealtimeRatio), java.math.MathContext.DECIMAL64).longValue() - elapsed;
+
+ long deltaMs = deltaNs / 1000000;
+ int deltaNsRem = (int)(deltaNs % 1000000);
+
+ if(deltaNs > 0) {
+ synchronized(tasks) {
+ try {
+ tasks.wait(deltaMs, deltaNsRem);
+ } catch (InterruptedException e) {
+ LOGGER.error("Realtime pacing wait was interrupted", e);
+ }
+ }
+ }
+ }
+ updateTimes();
+ }
+
+ } else {
+
+ while(!inState(StandardExperimentStates.Running.class) && inActiveState()) {
+
+ synchronized(tasks) {
+ int ran = runTasks();
+ if(ran == 0) {
+ try {
+ tasks.wait(Integer.MAX_VALUE);
+ } catch (InterruptedException e) {
+ LOGGER.error("Wait for tasks was interrupted", e);
+ }
+ }
+ }
+
+ }
+
+ }
+
+ if(runTimeNs >= endTimeNs && inActiveState())
+ changeState(StandardExperimentStates.STOPPED);
+
+ }
+
+ }
+
+ Thread executorThread = this;
+
+ Semaphore beginSyncExec = new Semaphore(0);
+ Semaphore endSyncExec = new Semaphore(0);
+
+ Runnable scheduleSyncExec = () -> {
+ beginSyncExec.release();
+ try {
+ endSyncExec.acquire();
+ } catch (InterruptedException e) {
+ // Restore the interrupt status so callers can observe the interruption.
+ Thread.currentThread().interrupt();
+ }
+ };
+
+ public int runTasks() {
+ ArrayList<Runnable> todo = new ArrayList<>();
+ synchronized(tasks) {
+ todo.addAll(tasks);
+ tasks.clear();
+ }
+ todo.forEach(Runnable::run);
+ return todo.size();
+ }
+
+ public void queue(Runnable runnable) {
+ synchronized(tasks) {
+ tasks.add(runnable);
+ tasks.notify();
+ }
+ }
+
+ public <T> T syncExec(Callable<T> callable) throws InterruptedException {
+
+ if(executorThread == Thread.currentThread()) {
+ try {
+ return callable.call();
+ } catch (Throwable t) {
+ LOGGER.error("syncExec in current thread failed", t);
+ return null;
+ } finally {
+ }
+ }
+
+ queue(scheduleSyncExec);
+
+ beginSyncExec.acquire();
+ Thread oldThread = executorThread;
+ executorThread = Thread.currentThread();
+ try {
+ return callable.call();
+ } catch (Throwable t) {
+ LOGGER.error("syncExec failed", t);
+ return null;
+ } finally {
+ executorThread = oldThread;
+ endSyncExec.release();
+ }
+
+ }
+
+ public void asyncExec(Runnable runnable) {
+
+ if(executorThread == Thread.currentThread()) {
+ try {
+ runnable.run();
+ } catch (Throwable t) {
+ LOGGER.error("asyncExec failed", t);
+ } finally {
+ }
+ return;
+ }
+
+ queue(runnable);
+
+ }
+
+ public void setSimulationStepNs(long ns) {
+ simulationStepNs = ns;
+ stepInSeconds = BigDecimal.valueOf(simulationStepNs).multiply(BigDecimal.valueOf(1e-9)).doubleValue();
+ }
+
+ public void runDuration(long ns) {
+ runStart = System.nanoTime();
+ runTimeNs = 0;
+ endTimeNs = ns;
+ synchronized(tasks) {
+ changeState(StandardExperimentStates.RUNNING);
+ tasks.notify();
+ }
+ }
+
+ public ExperimentState getExperimentState() {
+ return state;
+ }
+
+ public void changeState(ExperimentState state) {
+ this.state = state;
+ fireStateChanged(state);
+ }
+
+ public void addListener(DynamicExperimentThreadListener listener) {
+ if(!listeners.contains(listener))
+ listeners.add(listener);
+ }
+
+ public void removeListener(DynamicExperimentThreadListener listener) {
+ listeners.remove(listener);
+ }
+
+ protected void fireAfterStep() {
+ listeners.forEach(DynamicExperimentThreadListener::afterStep);
+ }
+
+ protected void fireBeforeStep() {
+ listeners.forEach(DynamicExperimentThreadListener::beforeStep);
+ }
+
+ protected void fireStateChanged(ExperimentState newState) {
+ listeners.forEach(l -> l.stateChanged(newState));
+ }
+
+}
\ No newline at end of file
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import org.simantics.simulator.ExperimentState;
+
+/**
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public interface DynamicExperimentThreadListener {
+ default void beforeStep() {}
+ default void afterStep() {}
+ default void stateChanged(ExperimentState newState) {}
+}
\ No newline at end of file
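For orientation, here is a self-contained sketch of how a listener with this interface's shape could count completed simulation steps. The types below (`ThreadListener`, `ExperimentState`) are local stand-ins mirroring the API, not the Simantics classes themselves.

```java
import java.util.ArrayList;
import java.util.List;

// Local stand-ins mirroring the listener API; not the Simantics classes.
interface ExperimentState {}

interface ThreadListener {
    default void beforeStep() {}
    default void afterStep() {}
    default void stateChanged(ExperimentState newState) {}
}

public class ListenerDemo {
    public static void main(String[] args) {
        List<ThreadListener> listeners = new ArrayList<>();
        int[] steps = {0};
        // Count completed steps via the afterStep callback.
        listeners.add(new ThreadListener() {
            @Override public void afterStep() { steps[0]++; }
        });
        for (int i = 0; i < 3; i++) {
            listeners.forEach(ThreadListener::beforeStep);
            // ... the simulation step would execute here ...
            listeners.forEach(ThreadListener::afterStep);
        }
        System.out.println(steps[0]); // prints "3"
    }
}
```

The default methods make every callback optional, so a listener only overrides the events it cares about.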
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import org.eclipse.core.runtime.NullProgressMonitor;
+import org.simantics.scl.runtime.SCLContext;
+import org.simantics.scl.runtime.function.Function;
+import org.simantics.simulator.IDynamicExperimentLocal;
+
+/**
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public class DynamicExperimentThreadSequenceRunner {
+ @SuppressWarnings({ "rawtypes", "unchecked" })
+ public static DynamicExperimentActionContext runAction(IDynamicExperimentLocal experiment, DynamicExperimentThread thread, Function action, final boolean simulateAndWaitCompletion) {
+ final DynamicExperimentActionContext context = new DynamicExperimentActionContext(experiment);
+ context.scheduleNextStep(action);
+ final Object sync = new Object();
+ final SCLContext sclContext = SCLContext.getCurrent();
+
+ thread.addListener(new DynamicExperimentThreadListener() {
+
+ @Override
+ public void beforeStep() {
+ if(!context.isStopped()) {
+ SCLContext.push(sclContext);
+ context.handleStep(experiment.getSimulationTime());
+ SCLContext.pop();
+ }
+ removeIfStopped();
+ }
+
+ public void removeIfStopped() {
+ if(context.isStopped()) {
+ thread.removeListener(this);
+ if(simulateAndWaitCompletion) {
+ experiment.simulate(false);
+ synchronized(sync) {
+ sync.notify();
+ }
+ }
+ experiment.shutdown(new NullProgressMonitor());
+ }
+ }
+
+ });
+
+ if(simulateAndWaitCompletion) {
+ experiment.simulate(true);
+
+ try {
+ synchronized(sync) {
+ while(!context.isStopped())
+ sync.wait(1000L);
+ }
+ } catch(InterruptedException e) {
+ context.stop();
+ }
+
+ if (context.exceptions != null && !context.exceptions.isEmpty()) {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Action failures:");
+ for (Exception e : context.exceptions) {
+ builder.append("\n");
+ builder.append(e.getMessage());
+ }
+
+ throw new RuntimeException(builder.toString());
+ }
+ }
+ return context;
+ }
+}
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import org.simantics.simulator.ExperimentState;
+
+/**
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public class StandardExperimentStates implements ExperimentState {
+
+ /**
+ * The experiment context has been created.
+ *
+ * <p>
+ * Allowed successor states: {@link Instantiated}, {@link ToBeDisposed}
+ */
+ public interface Created extends ExperimentState {}
+
+ /**
+ * The experiment context has been instantiated, i.e. underlying experimentation
+ * resources have been acquired. Experiment setup that cannot be changed once
+ * the experiment is initialized (e.g. setting certain values) can be done in
+ * this state.
+ *
+ * <p>
+ * Allowed successor states: {@link Initializing}, {@link ToBeDisposed}
+ */
+ public interface Instantiated extends ExperimentState {}
+
+ /**
+ * The experiment context is in the process of being initialized. This means
+ * that any initial conditions or related data is being loaded into the
+ * simulator, solver, or whatever system runs the experiment.
+ *
+ * If the initialization fails due to an irrecoverable system error, the
+ * experiment shall move to {@link Failure} state. If the initialization fails
+ * due to a recoverable error in user input, the experiment shall move to
+ * {@link Instantiated} state.
+ *
+ * <p>
+ * Allowed successor states: {@link Initialized}, {@link Instantiated},
+ * {@link Failure}
+ */
+ public interface Initializing extends ExperimentState {}
+
+ /**
+ * The experiment context has been initialized, i.e. the underlying
+ * experimentation resources have been both acquired and initialized with a
+ * specific state.
+ *
+ * <p>
+ * Allowed successor states: {@link Stopped}, {@link Running},
+ * {@link ToBeDisposed}
+ */
+ public interface Initialized extends ExperimentState {}
+
+ /**
+ * The experiment shall be run until it reaches an objective, such as a
+ * steady state or a given duration, or until the state changes to
+ * {@link Stopped}.
+ *
+ * <p>
+ * Allowed successor states: {@link Stopped}, {@link ToBeDisposed}
+ */
+ public interface Running extends ExperimentState {}
+
+ /**
+ * The experiment shall remain stopped. Everything in the experiment context
+ * should remain constant while in this state.
+ *
+ * <p>
+ * Allowed successor states: {@link Running}, {@link ToBeDisposed}
+ */
+ public interface Stopped extends ExperimentState {}
+
+ /**
+ * Moving into this state marks the beginning of the disposal of this
+ * experiment. The imminent disposal is irreversible and cannot be vetoed
+ * or interrupted.
+ *
+ * <p>
+ * Allowed successor states: {@link Disposing}
+ */
+ public interface ToBeDisposed extends ExperimentState {}
+
+ /**
+ * The experiment context is being disposed, i.e. underlying experimentation
+ * resources are being freed and disposed of accordingly.
+ *
+ * <p>
+ * Allowed successor states: {@link Disposing}
+ */
+ public interface Disposing extends ExperimentState {}
+
+ /**
+ * The experiment has been completely disposed and can no longer be used for
+ * anything.
+ *
+ * <p>
+ * Allowed successor states: none, this is a final state
+ */
+ public interface Disposed extends ExperimentState {}
+
+ /**
+ * The experiment implementation has run into a fatal failure. The experiment
+ * context can still be accessed at this point but the experiment can only move
+ * into {@link Disposed} state.
+ *
+ * <p>
+ * Allowed successor states: {@link Disposed}
+ */
+ public interface Failure extends ExperimentState {}
+
+ public static final ExperimentState CREATED = new Created() {};
+ public static final ExperimentState INSTANTIATED = new Instantiated() {};
+ public static final ExperimentState INITIALIZING = new Initializing() {};
+ public static final ExperimentState INITIALIZED = new Initialized() {};
+ public static final ExperimentState RUNNING = new Running() {};
+ public static final ExperimentState STOPPED = new Stopped() {};
+ public static final ExperimentState TO_BE_DISPOSED = new ToBeDisposed() {};
+ public static final ExperimentState DISPOSING = new Disposing() {};
+ public static final ExperimentState DISPOSED = new Disposed() {};
+
+}
\ No newline at end of file
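The marker-interface pattern above pairs with the `inState(Class)` check in `DynamicExperimentThread`. A minimal self-contained sketch of that mechanism, using local stand-in interfaces rather than the Simantics classes:

```java
// Local stand-ins mirroring the marker-interface state pattern of
// StandardExperimentStates; not the Simantics classes themselves.
interface State {}
interface Running extends State {}
interface Stopped extends State {}

public class StatePatternDemo {
    // Singleton instances created as anonymous implementations, as in
    // StandardExperimentStates.RUNNING / STOPPED.
    static final State RUNNING = new Running() {};
    static final State STOPPED = new Stopped() {};

    static State current = RUNNING;

    // Mirrors DynamicExperimentThread.inState: a Class.isInstance check
    // against the state's marker interface.
    static boolean inState(Class<? extends State> state) {
        return state.isInstance(current);
    }

    public static void main(String[] args) {
        System.out.println(inState(Running.class)); // prints "true"
        current = STOPPED;
        System.out.println(inState(Running.class)); // prints "false"
        System.out.println(inState(Stopped.class)); // prints "true"
    }
}
```

Because states are compared by interface rather than by object identity, a simulator can substitute its own richer state objects while remaining matchable against the standard state interfaces.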
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+/**
+ * Standard simulator variable node interface used with {@link StandardNodeManagerSupport}
+ * and {@link StandardNodeManager}.
+ *
+ * This used to exist in org.simantics.db.layer0 in earlier versions but was
+ * moved here to make it DB-independent.
+ *
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public interface StandardNode {
+}
* VTT Technical Research Centre of Finland - initial API and implementation
* Semantum Oy - initial API and implementation
*******************************************************************************/
-package org.simantics.db.layer0;
+package org.simantics.simulator.toolkit;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;
import org.simantics.databoard.Bindings;
+import org.simantics.databoard.adapter.AdaptException;
+import org.simantics.databoard.adapter.Adapter;
+import org.simantics.databoard.adapter.AdapterConstructionException;
import org.simantics.databoard.binding.Binding;
import org.simantics.databoard.binding.VariantBinding;
-import org.simantics.databoard.binding.error.BindingConstructionException;
import org.simantics.databoard.binding.error.BindingException;
import org.simantics.databoard.binding.error.RuntimeBindingConstructionException;
import org.simantics.databoard.binding.mutable.Variant;
import org.simantics.databoard.type.Datatype;
-import org.simantics.db.exception.DatabaseException;
import org.simantics.simulator.variable.NodeManager;
import org.simantics.simulator.variable.Realm;
import org.simantics.simulator.variable.exceptions.NoSuchNodeException;
import org.simantics.simulator.variable.exceptions.NodeManagerException;
import org.simantics.simulator.variable.exceptions.NotInRealmException;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
import gnu.trove.map.hash.THashMap;
import gnu.trove.procedure.TObjectProcedure;
*
* @author Antti Villberg
*/
-public abstract class StandardNodeManager<Node,Engine extends StandardEngine<Node>> implements NodeManager<Node> {
-
- final private Node root;
- final private StandardRealm<Node,Engine> realm;
+public abstract class StandardNodeManager<Node, Engine extends StandardNodeManagerSupport<Node>> implements NodeManager<Node> {
- final static Binding NO_BINDING = new VariantBinding() {
+ private static final Logger LOGGER = LoggerFactory.getLogger(StandardNodeManager.class);
+
+ private final Node root;
+ private final StandardRealm<Node,Engine> realm;
+
+ static final Binding NO_BINDING = new VariantBinding() {
@Override
public Object getContent(Object variant, Binding contentBinding) throws BindingException {
public void assertInstaceIsValid(Object obj, Set<Object> validInstances) throws BindingException {
throw new Error();
}
-
+
@Override
public int compare(Object o1, Object o2) throws org.simantics.databoard.binding.error.RuntimeBindingException {
- if(o1 == null) {
- if(o2 == null) {
- return 0;
- } else {
- return - System.identityHashCode(o2);
- }
- } else {
- if(o2 == null) {
- return System.identityHashCode(o1);
- } else {
- if(o1.equals(o2)) return 0;
- return System.identityHashCode(o1) - System.identityHashCode(o2);
- }
- }
+ if(o1 == null) {
+ if(o2 == null) {
+ return 0;
+ } else {
+ return - System.identityHashCode(o2);
+ }
+ } else {
+ if(o2 == null) {
+ return System.identityHashCode(o1);
+ } else {
+ if(o1.equals(o2)) return 0;
+ return System.identityHashCode(o1) - System.identityHashCode(o2);
+ }
+ }
}
};
-
- THashMap<Node, Object> valueCache = new THashMap<Node, Object>();
- protected THashMap<Node, THashSet<Runnable>> listeners = new THashMap<Node, THashSet<Runnable>>();
-
+
+ THashMap<Node, Variant> valueCache = new THashMap<>();
+ protected THashMap<Node, THashSet<Runnable>> listeners = new THashMap<>();
+
AtomicBoolean fireNodeListenersScheduled = new AtomicBoolean(false);
Runnable fireNodeListeners = new Runnable() {
@Override
public void run() {
fireNodeListenersScheduled.set(false);
- final TObjectProcedure<Runnable> procedure = new TObjectProcedure<Runnable>() {
- @Override
- public boolean execute(Runnable object) {
- object.run();
- return true;
- }
+ TObjectProcedure<Runnable> procedure = r -> {
+ r.run();
+ return true;
};
synchronized(listeners) {
- listeners.forEachValue(new TObjectProcedure<THashSet<Runnable>>() {
- @Override
- public boolean execute(THashSet<Runnable> object) {
- object.forEach(procedure);
- return true;
- }
+ listeners.forEachValue(set -> {
+ set.forEach(procedure);
+ return true;
});
}
}
};
-
- Runnable clearValueCache = new Runnable() {
- @Override
- public void run() {
- valueCache.clear();
- }
- };
-
+
+ Runnable clearValueCache = () -> valueCache.clear();
+
public StandardNodeManager(StandardRealm<Node,Engine> realm, Node root) {
- this.realm = realm;
- this.root = root;
- }
-
- @Override
- public List<String> getChildNames(Node node) throws NodeManagerException {
- List<Node> children = getChildren(node);
- ArrayList<String> names = new ArrayList<String>(children.size());
- for(Node child : children)
- names.add(getName(child));
- return names;
- }
-
- @Override
- public List<String> getPropertyNames(Node node) throws NodeManagerException {
- List<Node> properties = getProperties(node);
- ArrayList<String> names = new ArrayList<String>(properties.size());
- for(Node property : properties)
- names.add(getName(property));
- return names;
- }
-
- @Override
- public Object getValue(Node node, String propertyName, Binding binding)
- throws NodeManagerException, BindingException {
- Node property = getProperty(node, propertyName);
- if(property == null)
- throw new NoSuchNodeException("Didn't find a property " + propertyName);
- return getValue(property, binding);
- }
-
- @Override
- public void setValue(Node node, String propertyName, Object value,
- Binding binding) throws NodeManagerException, BindingException {
- Node property = getProperty(node, propertyName);
- if(property == null)
- throw new NoSuchNodeException("Didn't find a property " + propertyName);
- setValue(property, value, binding);
- }
-
- @Override
- public Variant getValue(Node node) throws NodeManagerException {
- Object value = getEngineValueOrCached(node);
- if (value instanceof Variant)
- return (Variant) value;
+ assert(realm != null);
+ assert(root != null);
+ this.realm = realm;
+ this.root = root;
+ }
+
+ @Override
+ public List<String> getChildNames(Node node) throws NodeManagerException {
+ List<Node> children = getChildren(node);
+ ArrayList<String> names = new ArrayList<>(children.size());
+ for(Node child : children)
+ names.add(getName(child));
+ return names;
+ }
+
+ @Override
+ public List<String> getPropertyNames(Node node) throws NodeManagerException {
+ List<Node> properties = getProperties(node);
+ ArrayList<String> names = new ArrayList<>(properties.size());
+ for(Node property : properties)
+ names.add(getName(property));
+ return names;
+ }
+
+ @Override
+ public Object getValue(Node node, String propertyName, Binding binding)
+ throws NodeManagerException, BindingException {
+ Node property = getProperty(node, propertyName);
+ if(property == null)
+ throw new NoSuchNodeException("Didn't find a property " + propertyName);
+ return getValue(property, binding);
+ }
+
+ @Override
+ public void setValue(Node node, String propertyName, Object value,
+ Binding binding) throws NodeManagerException, BindingException {
+ Node property = getProperty(node, propertyName);
+ if(property == null)
+ throw new NoSuchNodeException("Didn't find a property " + propertyName);
+ setValue(property, value, binding);
+ }
+
+ @Override
+ public Variant getValue(Node node, String propertyName)
+ throws NodeManagerException {
+ Node property = getProperty(node, propertyName);
+ if(property == null)
+ throw new NoSuchNodeException("Didn't find a property " + propertyName);
+ return getValue(property);
+ }
+
+ @Override
+ public Object getValue(Node node, Binding binding) throws NodeManagerException, BindingException {
try {
- Binding binding = Bindings.getBinding(value.getClass());
- return new Variant(binding, value);
- } catch (BindingConstructionException e) {
- e.printStackTrace();
- return null;
+ return getValue(node).getValue(binding);
+ } catch (AdaptException e) {
+ throw new BindingException(e);
}
- }
-
- @Override
- public Variant getValue(Node node, String propertyName)
- throws NodeManagerException {
- Node property = getProperty(node, propertyName);
- if(property == null)
- throw new NoSuchNodeException("Didn't find a property " + propertyName);
- return getValue(property);
- }
-
+ }
+
@Override
public String getPropertyURI(Node parent, Node property) {
return null;
}
-
+
@Override
public Realm getRealm() {
- return realm;
+ return realm;
}
-
+
public StandardRealm<Node, Engine> getStandardRealm() {
- return realm;
+ return realm;
}
-
+
protected String getRealmId() {
- return realm.getId();
+ return realm.getId();
}
-
+
public Node getRoot() {
- return root;
+ return root;
}
-
+
protected boolean isRoot(Node node) {
- return root.equals(node);
+ return root.equals(node);
}
@Override
synchronized(listeners) {
THashSet<Runnable> l = listeners.get(node);
if(l == null) {
- l = new THashSet<Runnable>();
+ l = new THashSet<>();
listeners.put(node, l);
}
l.add(listener);
}
}
}
-
+
public void fireNodeListeners() {
if(!fireNodeListenersScheduled.getAndSet(true))
realm.asyncExec(fireNodeListeners);
}
-
+
public void fireNodeListenersSync() {
- try {
- realm.syncExec(fireNodeListeners);
- } catch (InterruptedException e) {
- e.printStackTrace();
- }
+ try {
+ realm.syncExec(fireNodeListeners);
+ } catch (InterruptedException e) {
+ LOGGER.error("Synchronous node listener firing was interrupted.", e);
+ }
}
public void refreshVariables() {
public void refreshVariablesSync() {
try {
- realm.syncExec(clearValueCache);
- } catch (InterruptedException e) {
- e.printStackTrace();
- }
+ realm.syncExec(clearValueCache);
+ } catch (InterruptedException e) {
+ LOGGER.error("Synchronous value cache refresh was interrupted.", e);
+ }
fireNodeListenersSync();
}
-
- protected Object getEngineValueOrCached(Node node) throws NodeManagerException {
- Object value = valueCache.get(node);
- if(value == null) {
- value = realm.getEngine().getValue(node);
- valueCache.put(node, value);
+
+ protected Variant getEngineVariantOrCached(Node node) throws NodeManagerException {
+ Variant variant = valueCache.get(node);
+ if(variant == null) {
+ Object value = realm.getEngine().getEngineValue(node);
+ Binding binding = realm.getEngine().getEngineBinding(node);
+ variant = new Variant(binding, value);
+ valueCache.put(node, variant);
}
- return value;
+ return variant;
}
-
@Override
- public Object getValue(Node node, Binding binding) throws NodeManagerException {
+ public Variant getValue(Node node) throws NodeManagerException {
checkThreadAccess();
- return getEngineValueOrCached(node);
+ return getEngineVariantOrCached(node);
}
protected void checkThreadAccess() throws NodeManagerException {
if(Thread.currentThread() != realm.getThread())
throw new NotInRealmException();
}
-
- protected Datatype getDatatypeForValue(Object value) throws DatabaseException {
- Binding binding = Bindings.getBindingUnchecked(value.getClass());
- if(binding == null) return null;
- else return binding.type();
+
+ protected Datatype getDatatypeForValue(Object value) {
+ Binding binding = Bindings.getBindingUnchecked(value.getClass());
+ if(binding == null) return null;
+ else return binding.type();
}
-
+
@Override
public void setValue(Node node, Object value, Binding binding)
throws NodeManagerException {
- checkThreadAccess();
- valueCache.put(node, value);
- realm.getEngine().setValue(node, value);
- realm.getNodeManager().valueCache.put(node, value);
- refreshVariables();
+ checkThreadAccess();
+ Binding targetBinding = realm.getEngine().getEngineBinding(node);
+ if(binding.equals(targetBinding)) {
+ Variant variant = new Variant(binding, value);
+ valueCache.put(node, variant);
+ realm.getEngine().setEngineValue(node, value);
+ } else {
+ try {
+ Adapter adapter = Bindings.getAdapter(binding, targetBinding);
+ Object targetValue = adapter.adapt(value);
+ Variant variant = new Variant(targetBinding, targetValue);
+ valueCache.put(node, variant);
+ realm.getEngine().setEngineValue(node, targetValue);
+ } catch (AdapterConstructionException | AdaptException e) {
+ throw new NodeManagerException(e);
+ }
+ }
+ refreshVariables();
}
-
+
@Override
public String getName(Node node) {
if(isRoot(node)) {
- String id = getRealmId();
- int lastSlash = id.lastIndexOf("/");
- if(lastSlash == -1) throw new IllegalStateException("Invalid realm id " + id);
- String name = id.substring(lastSlash+1);
- return name;
+ String id = getRealmId();
+ int lastSlash = id.lastIndexOf("/");
+ if(lastSlash == -1) throw new IllegalStateException("Invalid realm id " + id);
+ String name = id.substring(lastSlash+1);
+ return name;
} else {
- return realm.getEngine().getName(node);
+ return realm.getEngine().getName(node);
}
}
-
@Override
public Node getNode(String path) throws NodeManagerException {
checkThreadAccess();
throw new UnsupportedOperationException();
}
-
+
@Override
public Node getChild(Node node, String name) throws NodeManagerException {
checkThreadAccess();
- Map<String,Node> map = realm.getEngine().getChildren(node);
- return map.get(name);
+ Map<String,Node> map = realm.getEngine().getChildren(node);
+ return map.get(name);
}
@Override
public Node getProperty(Node node, String name) throws NodeManagerException {
- checkThreadAccess();
- Map<String,Node> map = realm.getEngine().getProperties(node);
- return map.get(name);
+ checkThreadAccess();
+ Map<String,Node> map = realm.getEngine().getProperties(node);
+ return map.get(name);
}
@Override
public Datatype getDatatype(Node node) throws NodeManagerException {
checkThreadAccess();
try {
- Datatype type = getDatatypeForValue(getEngineValueOrCached(node));
- return type;
- } catch (DatabaseException e) {
- e.printStackTrace();
+ Variant v = getEngineVariantOrCached(node);
+ Binding b = v.getBinding();
+ if(b == null) return null;
+ return b.type();
} catch (RuntimeBindingConstructionException e) {
// There is no datatype for all values
}
valueCache.clear();
listeners.clear();
}
-}
+
+}
\ No newline at end of file
--- /dev/null
+package org.simantics.simulator.toolkit;
+
+import java.util.Map;
+
+import org.simantics.databoard.binding.Binding;
+import org.simantics.simulator.variable.NodeManager;
+import org.simantics.simulator.variable.exceptions.NodeManagerException;
+
+/**
+ * This interface is a simplified version of {@link NodeManager} that only
+ * provides node structure retrieval and property value access
+ * without regard to realms or listeners.
+ *
+ * This used to exist in org.simantics.db.layer0 in earlier versions but was
+ * moved here to make it DB-independent.
+ *
+ * @author Antti Villberg
+ * @since 1.34.0
+ * @param <Node> the node type managed by the implementing simulator
+ */
+public interface StandardNodeManagerSupport<Node> {
+
+ Object getEngineValue(Node node) throws NodeManagerException;
+ Binding getEngineBinding(Node node) throws NodeManagerException;
+ void setEngineValue(Node node, Object value) throws NodeManagerException;
+ String getName(Node node);
+ Map<String,Node> getChildren(Node node);
+ Map<String,Node> getProperties(Node node);
+
+}
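To illustrate the shape of this interface, here is a self-contained, map-backed sketch. The `NodeSupport` interface below is a simplified stand-in: a real implementation would use databoard `Binding`s and throw `NodeManagerException` as declared above.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in mirroring StandardNodeManagerSupport; not the
// Simantics interface itself (no Binding, no checked exceptions).
interface NodeSupport<Node> {
    Object getEngineValue(Node node);
    void setEngineValue(Node node, Object value);
    String getName(Node node);
    Map<String, Node> getChildren(Node node);
    Map<String, Node> getProperties(Node node);
}

public class MapNodeSupportDemo implements NodeSupport<String> {
    // Nodes are identified by slash-separated paths; values live in a map.
    private final Map<String, Object> values = new HashMap<>();
    private final Map<String, Map<String, String>> children = new HashMap<>();

    @Override public Object getEngineValue(String node) { return values.get(node); }
    @Override public void setEngineValue(String node, Object value) { values.put(node, value); }
    @Override public String getName(String node) {
        int i = node.lastIndexOf('/');
        return i < 0 ? node : node.substring(i + 1);
    }
    @Override public Map<String, String> getChildren(String node) {
        return children.getOrDefault(node, Map.of());
    }
    @Override public Map<String, String> getProperties(String node) { return Map.of(); }

    public static void main(String[] args) {
        MapNodeSupportDemo support = new MapNodeSupportDemo();
        support.setEngineValue("/model/T1", 42.0);
        System.out.println(support.getName("/model/T1"));        // prints "T1"
        System.out.println(support.getEngineValue("/model/T1")); // prints "42.0"
    }
}
```

A `StandardNodeManager` would wrap such a support object, adding realm thread checks, value caching, and listener notification on top of these primitive accessors.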
-package org.simantics.db.layer0;
+package org.simantics.simulator.toolkit;
import java.util.List;
-import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.Semaphore;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;
-import org.simantics.db.common.utils.Logger;
import org.simantics.scl.runtime.SCLContext;
import org.simantics.scl.runtime.tuple.Tuple0;
import org.simantics.simulator.variable.Realm;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
-abstract public class StandardRealm<Node, Engine extends StandardEngine<Node>> implements Realm {
+abstract public class StandardRealm<Node, Engine extends StandardNodeManagerSupport<Node>> implements Realm {
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(StandardRealm.class);
private String id;
- private Thread executorThread;
+ protected Thread executorThread;
private StandardRealmThreadFactory factory = new StandardRealmThreadFactory(this);
private ThreadPoolExecutor executor = new ThreadPoolExecutor(0, 1, 60, TimeUnit.SECONDS,
- new LinkedBlockingQueue<Runnable>(), factory);
+ new LinkedBlockingQueue<>(), factory);
private Semaphore beginSyncExec = new Semaphore(0);
private Semaphore endSyncExec = new Semaphore(0);
-
+
private Engine engine;
- private StandardNodeManager<Node, Engine> nodeManager;
-
+ protected StandardNodeManager<Node, Engine> nodeManager;
+
private Runnable scheduleSyncExec = new Runnable() {
@Override
public void run() {
}
}
};
-
+
protected StandardRealm(Engine engine, String id) {
this.engine = engine;
this.id = id;
}
abstract protected StandardNodeManager<Node, Engine> createManager();
-
+
protected String getSCLContextKey() {
- return getClass().getSimpleName();
+ return getClass().getSimpleName();
}
public String getId() {
return id;
}
-
+
public Engine getEngine() {
return engine;
}
public Thread getThread() {
return executorThread;
}
-
+
@SuppressWarnings({ "rawtypes", "unchecked" })
public Object syncExec(Function fun) throws InterruptedException {
-
executor.execute(scheduleSyncExec);
SCLContext context = SCLContext.getCurrent();
Engine oldConnection = (Engine)context.put(getSCLContextKey(), engine);
-
+
try {
beginSyncExec.acquire();
Thread oldThread = executorThread;
context.put(getSCLContextKey(), oldConnection);
}
}
-
+
@SuppressWarnings("rawtypes")
public void asyncExec(final Function fun) {
executor.execute(new Runnable() {
@Override
public void syncExec(Runnable runnable) throws InterruptedException {
-
if(executorThread == Thread.currentThread()) {
try {
runnable.run();
} catch (Throwable t) {
- Logger.defaultLogError(t);
+ LOGGER.error("Error executing runnable in realm", t);
} finally {
}
return;
}
- executor.execute(scheduleSyncExec);
-
+ executor.execute(scheduleSyncExec);
+
beginSyncExec.acquire();
Thread oldThread = executorThread;
executorThread = Thread.currentThread();
try {
runnable.run();
} catch (Throwable t) {
- Logger.defaultLogError(t);
+ LOGGER.error("Error executing runnable in realm", t);
} finally {
executorThread = oldThread;
endSyncExec.release();
@Override
public void asyncExec(Runnable runnable) {
-
- if(executorThread == Thread.currentThread()) {
+ if(executorThread == Thread.currentThread()) {
try {
runnable.run();
} catch (Throwable t) {
- Logger.defaultLogError(t);
+ LOGGER.error("Error executing runnable in realm", t);
} finally {
}
return;
}
-
+
executor.execute(runnable);
}
-
+
public void close() {
executor.shutdown();
try {
} catch (InterruptedException e) {
getLogger().info("Could not shutdown executor " + executor + " for realm " + this, e);
}
-
+
factory.clear();
factory = null;
// Should never be true
executorThread.interrupt();
executorThread = null;
executor = null;
-
+
// Clear nodeManager
nodeManager.clear();
nodeManager = null;
}
private static class StandardRealmThreadFactory implements ThreadFactory {
-
+
private StandardRealm<?, ?> realm;
public StandardRealmThreadFactory(StandardRealm<?, ?> realm) {
this.realm = realm;
}
-
+
@Override
public Thread newThread(Runnable r) {
Thread t = new Thread(r);
realm.setExecutorThread(t);
return t;
}
-
+
void clear() {
realm = null;
}
}
-}
+
+}
\ No newline at end of file
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<classpath>\r
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.8"/>\r
+ <classpathentry kind="con" path="org.eclipse.pde.core.requiredPlugins"/>\r
+ <classpathentry kind="src" path="src"/>\r
+ <classpathentry kind="output" path="bin"/>\r
+</classpath>\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>org.simantics.simulator</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.jdt.core.javabuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.ManifestBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.SchemaBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.PluginNature</nature>\r
+ <nature>org.eclipse.jdt.core.javanature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+Manifest-Version: 1.0
+Bundle-ManifestVersion: 2
+Bundle-Name: Simulator
+Bundle-SymbolicName: org.simantics.simulator
+Bundle-Version: 1.0.0.qualifier
+Bundle-RequiredExecutionEnvironment: JavaSE-1.8
+Require-Bundle: org.slf4j.api;bundle-version="1.7.25",
+ org.eclipse.core.runtime,
+ org.simantics.databoard;bundle-version="0.6.6"
+Export-Package: org.simantics.simulator
--- /dev/null
+source.. = src/\r
+output.. = bin/\r
+bin.includes = META-INF/,\\r
+ .\r
--- /dev/null
+/*******************************************************************************
+ * Copyright (c) 2018 Association for Decentralized Information Management
+ * in Industry THTH ry.
+ * All rights reserved. This program and the accompanying materials
+ * are made available under the terms of the Eclipse Public License v1.0
+ * which accompanies this distribution, and is available at
+ * http://www.eclipse.org/legal/epl-v10.html
+ *
+ * Contributors:
+ * Semantum Oy - initial API and implementation
+ *******************************************************************************/
+package org.simantics.simulator;
+
+/**
+ * An abstract representation of the state of an experiment.
+ *
+ * <p>
+ * A standard set of states can be found in
+ * <code>org.simantics.simulator.toolkit.StandardExperimentStates</code>.
+ *
+ * @author Antti Villberg
+ * @since 1.34.0
+ */
+public interface ExperimentState {
+}
\ No newline at end of file
--- /dev/null
+/*******************************************************************************
+ * Copyright (c) 2018 Association for Decentralized Information Management
+ * in Industry THTH ry.
+ * All rights reserved. This program and the accompanying materials
+ * are made available under the terms of the Eclipse Public License v1.0
+ * which accompanies this distribution, and is available at
+ * http://www.eclipse.org/legal/epl-v10.html
+ *
+ * Contributors:
+ * Semantum Oy - initial API and implementation
+ *******************************************************************************/
+package org.simantics.simulator;
+
+import org.simantics.databoard.binding.Binding;
+
+public interface IDynamicExperimentLocal extends IExperimentLocal {
+
+ /**
+ * Starts or stops the simulation, depending on the parameter.
+ */
+ public void simulate(boolean enabled);
+
+ /**
+ * Simulates the experiment for at least the given period of time.
+ * Passing 0 as the duration simulates a single 'step'.
+ * After the duration has elapsed, the simulation is stopped.
+ */
+ public void simulateDuration(double duration);
+
+ void setVariableValueById(String id, Object value, Binding binding);
+ Object getVariableValueById(String id);
+
+ double getSimulationTime();
+
+}
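The two simulation methods above can be illustrated with a hypothetical stand-in (the class name, the fixed step length, and the `isRunning` accessor are assumptions for illustration only, not part of this change):

```java
// Hypothetical stand-in for an IDynamicExperimentLocal implementation.
// It models only the documented contract: simulateDuration(d) advances
// simulation time by at least d (exactly one step when d == 0), then stops.
public class FixedStepExperiment {
    private static final double STEP = 0.1; // assumed fixed step length
    private boolean running;
    private double time;

    // Starts or stops the simulation, depending on the parameter.
    public void simulate(boolean enabled) {
        running = enabled;
    }

    // Simulates for at least `duration`; 0 runs exactly one step.
    public void simulateDuration(double duration) {
        running = true;
        double target = time + duration;
        do {
            time += STEP; // one solver step
        } while (time < target);
        running = false; // per the contract, stopped afterwards
    }

    public double getSimulationTime() { return time; }

    public boolean isRunning() { return running; }
}
```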
--- /dev/null
+/*******************************************************************************
+ * Copyright (c) 2018 Association for Decentralized Information Management
+ * in Industry THTH ry.
+ * All rights reserved. This program and the accompanying materials
+ * are made available under the terms of the Eclipse Public License v1.0
+ * which accompanies this distribution, and is available at
+ * http://www.eclipse.org/legal/epl-v10.html
+ *
+ * Contributors:
+ * Semantum Oy - initial API and implementation
+ *******************************************************************************/
+package org.simantics.simulator;
+
+import org.eclipse.core.runtime.IProgressMonitor;
+
+public interface IExperimentLocal {
+
+ <T> T getService(Class<T> clazz);
+
+ String getIdentifier();
+
+ /**
+ * @param monitor
+ * the progress monitor to use for reporting progress to the user
+ * during the operation. It is the caller's responsibility to
+ * call done() on the given monitor. Accepts null, indicating
+ * that no progress should be reported and that the operation
+ * cannot be cancelled.
+ */
+ void shutdown(IProgressMonitor monitor);
+
+ ExperimentState getStateL();
+ void changeStateL(ExperimentState state);
+
+}
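The null-monitor contract documented for `shutdown` above can be sketched as follows (the `Monitor` and `NullMonitor` types stand in for Eclipse's `IProgressMonitor` and `NullProgressMonitor`, and the two-step teardown is invented for illustration):

```java
// Minimal stand-in for IProgressMonitor, for illustration only.
interface Monitor {
    void beginTask(String name, int work);
    void worked(int units);
    void done();
}

// No-op monitor substituted when the caller passes null.
class NullMonitor implements Monitor {
    public void beginTask(String name, int work) {}
    public void worked(int units) {}
    public void done() {}
}

public class ShutdownExample {
    public static int released; // counts released resources, for the sketch

    // Accepts null, as documented: a no-op monitor is used internally,
    // so no progress is reported and the operation cannot be cancelled.
    public static void shutdown(Monitor monitor) {
        Monitor m = (monitor != null) ? monitor : new NullMonitor();
        m.beginTask("Shutting down experiment", 2);
        released++; m.worked(1); // e.g. stop the solver thread (assumed step)
        released++; m.worked(1); // e.g. flush experiment state (assumed step)
        // Note: the caller, not shutdown(), is responsible for m.done().
    }
}
```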
org.eclipse.e4.core.contexts,
org.eclipse.e4.ui.di,
org.simantics.browsing.ui.swt,
- org.slf4j.api;bundle-version="1.7.20"
+ org.slf4j.api;bundle-version="1.7.20",
+ org.simantics.simulator.toolkit;bundle-version="1.0.0",
+ org.simantics.simulator.toolkit.db;bundle-version="1.0.0"
Export-Package: org.apache.commons.math3.stat.regression,
org.simantics.spreadsheet.graph,
org.simantics.spreadsheet.graph.adapter,
import org.simantics.db.common.request.WriteResultRequest;
import org.simantics.db.common.session.SessionEventListenerAdapter;
import org.simantics.db.exception.DatabaseException;
-import org.simantics.db.layer0.StandardRealm;
import org.simantics.db.layer0.request.PossibleURIVariable;
import org.simantics.db.layer0.request.VariableName;
import org.simantics.db.layer0.request.VariableRead;
import org.simantics.db.request.Write;
import org.simantics.db.service.SessionEventSupport;
import org.simantics.layer0.Layer0;
+import org.simantics.simulator.toolkit.StandardRealm;
import org.simantics.spreadsheet.Adaptable;
import org.simantics.spreadsheet.CellEditor;
import org.simantics.spreadsheet.ClientModel;
import java.util.Map;
import java.util.Optional;
+import org.simantics.databoard.Bindings;
+import org.simantics.databoard.binding.Binding;
import org.simantics.databoard.binding.mutable.Variant;
import org.simantics.db.exception.DatabaseException;
-import org.simantics.db.layer0.StandardEngine;
+import org.simantics.simulator.toolkit.StandardNodeManagerSupport;
+import org.simantics.simulator.variable.exceptions.NodeManagerException;
import org.simantics.spreadsheet.graph.formula.SpreadsheetEvaluationEnvironment;
import org.simantics.spreadsheet.graph.synchronization.LineNodeUpdater;
import org.simantics.spreadsheet.graph.synchronization.LineUpdater;
import it.unimi.dsi.fastutil.longs.LongArraySet;
import it.unimi.dsi.fastutil.longs.LongLinkedOpenHashSet;
-public class SpreadsheetBook implements SpreadsheetElement<SpreadsheetElement, SpreadsheetElement>, StandardEngine<SheetNode>, Serializable, SheetNode<SpreadsheetEngine, SheetNode>, Solver, SolverNameUtil, ComponentFactory<SheetLineComponent>, ModuleUpdaterFactoryBase<SheetLineComponent> {
+public class SpreadsheetBook implements StandardNodeManagerSupport<SheetNode>, SpreadsheetElement<SpreadsheetElement, SpreadsheetElement>, Serializable, SheetNode<SpreadsheetEngine, SheetNode>, Solver, SolverNameUtil, ComponentFactory<SheetLineComponent>, ModuleUpdaterFactoryBase<SheetLineComponent> {
private static final long serialVersionUID = 7417208688311691396L;
}
@Override
- public Object getValue(SheetNode node) {
+ public Binding getEngineBinding(SheetNode node) throws NodeManagerException {
+ Object value = getEngineValue(node);
+ if(value instanceof Variant) return Bindings.VARIANT;
+ if(value instanceof String) return Bindings.STRING;
+ if(value instanceof Boolean) return Bindings.BOOLEAN;
+ return Bindings.VOID;
+ }
+
+ @Override
+ public Object getEngineValue(SheetNode node) {
if(node instanceof SpreadsheetCellContent) {
try {
SpreadsheetCellContent scc = (SpreadsheetCellContent)node;
}
@Override
- public void setValue(SheetNode node, Object value) {
+ public void setEngineValue(SheetNode node, Object value) {
}
@Override
import org.simantics.spreadsheet.Range;
import org.simantics.spreadsheet.graph.parser.ast.AstRange;
+@SuppressWarnings("rawtypes")
public class SpreadsheetEngine implements SpreadsheetElement, SheetNode {
private static final long serialVersionUID = -5246063647558595642L;
import org.simantics.db.common.utils.LiteralFileUtil;
import org.simantics.db.exception.DatabaseException;
import org.simantics.db.exception.ServiceException;
-import org.simantics.db.layer0.StandardRealm;
import org.simantics.db.layer0.util.Layer0Utils;
import org.simantics.db.layer0.variable.Variable;
import org.simantics.db.layer0.variable.Variables;
import org.simantics.db.service.ClusteringSupport;
import org.simantics.layer0.Layer0;
import org.simantics.scl.runtime.tuple.Tuple2;
+import org.simantics.simulator.toolkit.StandardRealm;
import org.simantics.spreadsheet.Range;
import org.simantics.spreadsheet.graph.synchronization.SpreadsheetSynchronizationEventHandler;
import org.simantics.spreadsheet.resource.SpreadsheetResource;
import java.util.Collections;
import java.util.Set;
-import org.simantics.db.layer0.StandardNodeManager;
import org.simantics.layer0.Layer0;
+import org.simantics.simulator.toolkit.StandardNodeManager;
import org.simantics.simulator.variable.exceptions.NodeManagerException;
import org.simantics.spreadsheet.resource.SpreadsheetResource;
import org.simantics.structural.stubs.StructuralResource2;
@SuppressWarnings("rawtypes")
public class SpreadsheetNodeManager extends StandardNodeManager<SheetNode, SpreadsheetBook> {
-
+
public SpreadsheetNodeManager(SpreadsheetRealm realm) {
super(realm, realm.getEngine());
}
static final Set<String> COMPONENT_CLASS = Collections.singleton(StructuralResource2.URIs.Component);
-
+
@Override
public Set<String> getClassifications(SheetNode node) throws NodeManagerException {
checkThreadAccess();
else
return Collections.emptySet();
}
-
+
@Override
public String getPropertyURI(SheetNode parent, SheetNode property) {
- if(property instanceof SpreadsheetCellContent) {
+ if(property instanceof SpreadsheetCellContent) {
return SpreadsheetResource.URIs.Cell_content;
- } else if(property instanceof SpreadsheetTypeNode) {
+ } else if(property instanceof SpreadsheetTypeNode) {
return Layer0.URIs.typeURI;
- } else if(property instanceof SpreadsheetCellContentExpression) {
+ } else if(property instanceof SpreadsheetCellContentExpression) {
return Layer0.URIs.SCLValue_expression;
- } else if (property instanceof SpreadsheetCellStyle) {
- return SpreadsheetResource.URIs.Cell_style;
- } else if (property instanceof SpreadsheetCellEditable){
- return SpreadsheetResource.URIs.Cell_editable;
- } else {
- return null;
- }
+ } else if (property instanceof SpreadsheetCellStyle) {
+ return SpreadsheetResource.URIs.Cell_style;
+ } else if (property instanceof SpreadsheetCellEditable){
+ return SpreadsheetResource.URIs.Cell_editable;
+ } else {
+ return null;
+ }
}
-
+
}
package org.simantics.spreadsheet.graph;
-import org.simantics.db.layer0.StandardNodeManager;
-import org.simantics.db.layer0.StandardRealm;
+import org.simantics.simulator.toolkit.StandardNodeManager;
+import org.simantics.simulator.toolkit.StandardRealm;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
+@SuppressWarnings("rawtypes")
public class SpreadsheetRealm extends StandardRealm<SheetNode,SpreadsheetBook> {
private static final Logger LOGGER = LoggerFactory.getLogger(SpreadsheetRealm.class);
-
+
SpreadsheetRealm(SpreadsheetBook book, String id) {
super(book, id);
}
import org.simantics.db.Resource;
import org.simantics.db.WriteGraph;
import org.simantics.db.exception.DatabaseException;
-import org.simantics.db.layer0.StandardRealm;
-import org.simantics.db.layer0.StandardSessionManager;
import org.simantics.db.layer0.variable.ProxyVariables;
import org.simantics.db.layer0.variable.Variable;
import org.simantics.db.layer0.variable.Variables;
+import org.simantics.simulator.toolkit.StandardRealm;
+import org.simantics.simulator.toolkit.db.StandardSessionManager;
import org.simantics.spreadsheet.graph.formula.SpreadsheetEvaluationEnvironment;
import org.simantics.spreadsheet.graph.synchronization.SpreadsheetSynchronizationEventHandler;
import org.simantics.spreadsheet.resource.SpreadsheetResource;
import org.simantics.db.common.utils.Logger;
import org.simantics.db.common.utils.NameUtils;
import org.simantics.db.exception.DatabaseException;
-import org.simantics.db.layer0.StandardRealm;
import org.simantics.db.layer0.exception.MissingVariableException;
import org.simantics.db.layer0.function.StandardChildDomainChildren;
import org.simantics.db.layer0.request.PossibleActiveRun;
import org.simantics.document.server.io.ITableCell;
import org.simantics.layer0.Layer0;
import org.simantics.scl.reflection.annotations.SCLValue;
+import org.simantics.simulator.toolkit.StandardRealm;
import org.simantics.simulator.variable.exceptions.NodeManagerException;
import org.simantics.spreadsheet.CellEditor;
import org.simantics.spreadsheet.ClientModel;
<modules>
<module>com.famfamfam.silk</module>
+ <module>hdf.hdf5lib</module>
<module>org.simantics</module>
<module>org.simantics.acorn</module>
<module>org.simantics.action.ontology</module>
<module>org.simantics.layer0.utils</module>
<module>org.simantics.layer0x.ontology</module>
<module>org.simantics.logback.configuration</module>
+ <module>org.simantics.logging</module>
+ <module>org.simantics.logging.ui</module>
<module>org.simantics.ltk</module>
<module>org.simantics.ltk.antlr</module>
<module>org.simantics.lz4</module>
<module>org.simantics.simulation.ontology</module>
<module>org.simantics.simulation.sequences</module>
<module>org.simantics.simulation.ui</module>
+ <module>org.simantics.simulator</module>
+ <module>org.simantics.simulator.toolkit</module>
+ <module>org.simantics.simulator.toolkit.db</module>
<module>org.simantics.simulator.variable</module>
<module>org.simantics.softwareconfiguration.ontology</module>
<module>org.simantics.spreadsheet</module>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>hdf.hdf5.feature</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.FeatureBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.FeatureNature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+\r
+Copyright Notice and License Terms for \r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 2006-2016 by The HDF Group.\r
+\r
+NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 1998-2006 by the Board of Trustees of the University of Illinois.\r
+\r
+All rights reserved.\r
+\r
+Redistribution and use in source and binary forms, with or without \r
+modification, are permitted for any purpose (including commercial purposes) \r
+provided that the following conditions are met:\r
+\r
+1. Redistributions of source code must retain the above copyright notice, \r
+ this list of conditions, and the following disclaimer.\r
+\r
+2. Redistributions in binary form must reproduce the above copyright notice, \r
+ this list of conditions, and the following disclaimer in the documentation \r
+ and/or materials provided with the distribution.\r
+\r
+3. In addition, redistributions of modified forms of the source or binary \r
+ code must carry prominent notices stating that the original code was \r
+ changed and the date of the change.\r
+\r
+4. All publications or advertising materials mentioning features or use of \r
+ this software are asked, but not required, to acknowledge that it was \r
+ developed by The HDF Group and by the National Center for Supercomputing \r
+ Applications at the University of Illinois at Urbana-Champaign and \r
+ credit the contributors.\r
+\r
+5. Neither the name of The HDF Group, the name of the University, nor the \r
+ name of any Contributor may be used to endorse or promote products derived \r
+ from this software without specific prior written permission from \r
+ The HDF Group, the University, or the Contributor, respectively.\r
+\r
+DISCLAIMER: \r
+THIS SOFTWARE IS PROVIDED BY THE HDF GROUP AND THE CONTRIBUTORS \r
+"AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED. In no \r
+event shall The HDF Group or the Contributors be liable for any damages \r
+suffered by the users arising out of the use of this software, even if \r
+advised of the possibility of such damage. \r
+\r
+-----------------------------------------------------------------------------\r
+-----------------------------------------------------------------------------\r
+\r
+Contributors: National Center for Supercomputing Applications (NCSA) at \r
+the University of Illinois, Fortner Software, Unidata Program Center (netCDF), \r
+The Independent JPEG Group (JPEG), Jean-loup Gailly and Mark Adler (gzip), \r
+and Digital Equipment Corporation (DEC).\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the Lawrence Berkeley \r
+National Laboratory (LBNL) and the United States Department of Energy \r
+under Prime Contract No. DE-AC02-05CH11231.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the University of \r
+California, Lawrence Livermore National Laboratory (UC LLNL). \r
+The following statement applies to those portions of the product and must \r
+be retained in any redistribution of source code, binaries, documentation, \r
+and/or accompanying materials:\r
+\r
+ This work was partially produced at the University of California, \r
+ Lawrence Livermore National Laboratory (UC LLNL) under contract \r
+ no. W-7405-ENG-48 (Contract 48) between the U.S. Department of Energy \r
+ (DOE) and The Regents of the University of California (University) \r
+ for the operation of UC LLNL.\r
+\r
+ DISCLAIMER: \r
+ This work was prepared as an account of work sponsored by an agency of \r
+ the United States Government. Neither the United States Government nor \r
+ the University of California nor any of their employees, makes any \r
+ warranty, express or implied, or assumes any liability or responsibility \r
+ for the accuracy, completeness, or usefulness of any information, \r
+ apparatus, product, or process disclosed, or represents that its use \r
+ would not infringe privately- owned rights. Reference herein to any \r
+ specific commercial products, process, or service by trade name, \r
+ trademark, manufacturer, or otherwise, does not necessarily constitute \r
+ or imply its endorsement, recommendation, or favoring by the United \r
+ States Government or the University of California. The views and \r
+ opinions of authors expressed herein do not necessarily state or reflect \r
+ those of the United States Government or the University of California, \r
+ and shall not be used for advertising or product endorsement purposes.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 is available with the SZIP compression library but SZIP is not part \r
+of HDF5 and has separate copyright and license terms. See “Szip Compression \r
+in HDF Products” (www.hdfgroup.org/doc_resource/SZIP/) for further details.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
--- /dev/null
+bin.includes = feature.xml,\\r
+ HDF5.license.txt\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<feature
+ id="hdf.hdf5"
+ label="HDF5 for Java"
+ version="1.10.0.patch1"
+ provider-name="Semantum Oy">
+
+ <description>
+ This feature bundles together the Java HDF5 Interface library
+(hdf.hdf5lib) and its dependencies.
+ </description>
+
+ <copyright url="https://svn.hdfgroup.org/hdf5/trunk/COPYING">
+ Copyright Notice and License Terms for
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities
+-----------------------------------------------------------------------------
+
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities
+Copyright 2006-2016 by The HDF Group.
+
+NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities
+Copyright 1998-2006 by the Board of Trustees of the University of Illinois.
+
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted for any purpose (including commercial purposes)
+provided that the following conditions are met:
+
+1. Redistributions of source code must retain the above copyright notice,
+ this list of conditions, and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions, and the following disclaimer in the documentation
+ and/or materials provided with the distribution.
+
+3. In addition, redistributions of modified forms of the source or binary
+ code must carry prominent notices stating that the original code was
+ changed and the date of the change.
+
+4. All publications or advertising materials mentioning features or use of
+ this software are asked, but not required, to acknowledge that it was
+ developed by The HDF Group and by the National Center for Supercomputing
+ Applications at the University of Illinois at Urbana-Champaign and
+ credit the contributors.
+
+5. Neither the name of The HDF Group, the name of the University, nor the
+ name of any Contributor may be used to endorse or promote products derived
+ from this software without specific prior written permission from
+ The HDF Group, the University, or the Contributor, respectively.
+
+DISCLAIMER:
+THIS SOFTWARE IS PROVIDED BY THE HDF GROUP AND THE CONTRIBUTORS
+"AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED. In no
+event shall The HDF Group or the Contributors be liable for any damages
+suffered by the users arising out of the use of this software, even if
+advised of the possibility of such damage.
+
+-----------------------------------------------------------------------------
+-----------------------------------------------------------------------------
+
+Contributors: National Center for Supercomputing Applications (NCSA) at
+the University of Illinois, Fortner Software, Unidata Program Center (netCDF),
+The Independent JPEG Group (JPEG), Jean-loup Gailly and Mark Adler (gzip),
+and Digital Equipment Corporation (DEC).
+
+-----------------------------------------------------------------------------
+
+Portions of HDF5 were developed with support from the Lawrence Berkeley
+National Laboratory (LBNL) and the United States Department of Energy
+under Prime Contract No. DE-AC02-05CH11231.
+
+-----------------------------------------------------------------------------
+
+Portions of HDF5 were developed with support from the University of
+California, Lawrence Livermore National Laboratory (UC LLNL).
+The following statement applies to those portions of the product and must
+be retained in any redistribution of source code, binaries, documentation,
+and/or accompanying materials:
+
+ This work was partially produced at the University of California,
+ Lawrence Livermore National Laboratory (UC LLNL) under contract
+ no. W-7405-ENG-48 (Contract 48) between the U.S. Department of Energy
+ (DOE) and The Regents of the University of California (University)
+ for the operation of UC LLNL.
+
+ DISCLAIMER:
+ This work was prepared as an account of work sponsored by an agency of
+ the United States Government. Neither the United States Government nor
+ the University of California nor any of their employees, makes any
+ warranty, express or implied, or assumes any liability or responsibility
+ for the accuracy, completeness, or usefulness of any information,
+ apparatus, product, or process disclosed, or represents that its use
+ would not infringe privately- owned rights. Reference herein to any
+ specific commercial products, process, or service by trade name,
+ trademark, manufacturer, or otherwise, does not necessarily constitute
+ or imply its endorsement, recommendation, or favoring by the United
+ States Government or the University of California. The views and
+ opinions of authors expressed herein do not necessarily state or reflect
+ those of the United States Government or the University of California,
+ and shall not be used for advertising or product endorsement purposes.
+
+-----------------------------------------------------------------------------
+
+HDF5 is available with the SZIP compression library but SZIP is not part
+of HDF5 and has separate copyright and license terms. See “Szip Compression
+in HDF Products” (www.hdfgroup.org/doc_resource/SZIP/) for further details.
+
+-----------------------------------------------------------------------------
+ </copyright>
+
+ <plugin
+ id="org.slf4j.api"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="hdf.hdf5lib"
+ os="win32"
+ arch="x86_64"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+</feature>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.simantics.logging.feature</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.pde.FeatureBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.pde.FeatureNature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+bin.includes = feature.xml
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<feature
+ id="org.simantics.logging.feature"
+ label="Simantics Logging Feature"
+ version="1.0.0.qualifier"
+ provider-name="Semantum Oy">
+
+ <description url="http://www.example.com/description">
+ [Enter Feature Description here.]
+ </description>
+
+ <copyright url="http://www.example.com/copyright">
+ [Enter Copyright Description here.]
+ </copyright>
+
+ <license url="http://www.example.com/license">
+ [Enter License Description here.]
+ </license>
+
+ <plugin
+ id="org.simantics.logging"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+</feature>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.simantics.logging.ui.feature</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.pde.FeatureBuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.pde.FeatureNature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+bin.includes = feature.xml
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<feature
+ id="org.simantics.logging.ui.feature"
+ label="Simantics Logging UI Feature"
+ version="1.0.0.qualifier"
+ provider-name="Semantum Oy">
+
+ <description url="http://www.example.com/description">
+ [Enter Feature Description here.]
+ </description>
+
+ <copyright url="http://www.example.com/copyright">
+ [Enter Copyright Description here.]
+ </copyright>
+
+ <license url="http://www.example.com/license">
+ [Enter License Description here.]
+ </license>
+
+ <includes
+ id="org.simantics.logging.feature"
+ version="0.0.0"/>
+
+ <plugin
+ id="org.simantics.logging.ui"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+</feature>
id="org.simantics.scl.ui.feature"
version="0.0.0"/>
+ <includes
+ id="org.simantics.simulator.toolkit.feature"
+ version="0.0.0"/>
+
+ <includes
+ id="org.simantics.simulator.toolkit.db.feature"
+ version="0.0.0"/>
+
+ <includes
+ id="hdf.hdf5"
+ version="0.0.0"/>
+
+ <includes
+ id="org.simantics.logging.feature"
+ version="0.0.0"/>
+
+ <includes
+ id="org.simantics.logging.ui.feature"
+ version="0.0.0"/>
+
<plugin
id="org.simantics.fileimport"
download-size="0"
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>org.simantics.simulator.toolkit.db.feature</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.FeatureBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.FeatureNature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+bin.includes = feature.xml\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<feature
+ id="org.simantics.simulator.toolkit.db.feature"
+ label="Simulator Toolkit for Simantics Database"
+ version="1.0.0.qualifier"
+ provider-name="Semantum Oy">
+
+ <description url="http://www.example.com/description">
+ [Enter Feature Description here.]
+ </description>
+
+ <copyright url="http://www.example.com/copyright">
+ [Enter Copyright Description here.]
+ </copyright>
+
+ <license url="http://www.example.com/license">
+ [Enter License Description here.]
+ </license>
+
+ <includes
+ id="org.simantics.simulator.toolkit.feature"
+ version="0.0.0"/>
+
+ <plugin
+ id="org.simantics.simulator.toolkit.db"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+</feature>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>\r
+<projectDescription>\r
+ <name>org.simantics.simulator.toolkit.feature</name>\r
+ <comment></comment>\r
+ <projects>\r
+ </projects>\r
+ <buildSpec>\r
+ <buildCommand>\r
+ <name>org.eclipse.pde.FeatureBuilder</name>\r
+ <arguments>\r
+ </arguments>\r
+ </buildCommand>\r
+ </buildSpec>\r
+ <natures>\r
+ <nature>org.eclipse.pde.FeatureNature</nature>\r
+ </natures>\r
+</projectDescription>\r
--- /dev/null
+bin.includes = feature.xml\r
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<feature
+ id="org.simantics.simulator.toolkit.feature"
+ label="Simantics Simulator Toolkit"
+ version="1.0.0.qualifier"
+ provider-name="Semantum Oy">
+
+ <description url="http://www.example.com/description">
+ [Enter Feature Description here.]
+ </description>
+
+ <copyright url="http://www.example.com/copyright">
+ [Enter Copyright Description here.]
+ </copyright>
+
+ <license url="http://www.example.com/license">
+ [Enter License Description here.]
+ </license>
+
+ <plugin
+ id="org.simantics.simulator"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="org.simantics.simulator.toolkit"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="org.simantics.simulator.variable"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="org.simantics.databoard"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="org.simantics.simulation.sequences"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+ <plugin
+ id="org.simantics.scl.osgi"
+ download-size="0"
+ install-size="0"
+ version="0.0.0"
+ unpack="false"/>
+
+</feature>
id="org.simantics.scl"
version="0.0.0"/>
+ <includes
+ id="org.simantics.simulator.toolkit.db.feature"
+ version="0.0.0"/>
+
<plugin
id="org.simantics.spreadsheet.common"
download-size="0"
<modules>
<module>com.lowagie.text.feature</module>
+ <module>hdf.hdf5.feature</module>
<module>org.apache.lucene4.feature</module>
<module>org.jfree.feature</module>
<module>org.simantics.acorn.feature</module>
<module>org.simantics.issues.feature</module>
<module>org.simantics.issues.ui.feature</module>
<module>org.simantics.layer0.feature</module>
+ <module>org.simantics.logging.feature</module>
+ <module>org.simantics.logging.ui.feature</module>
<module>org.simantics.message.feature</module>
<module>org.simantics.migration.feature</module>
<module>org.simantics.modeling.feature</module>
<module>org.simantics.sdk.feature</module>
<module>org.simantics.selectionview.feature</module>
<module>org.simantics.simulation.feature</module>
+ <module>org.simantics.simulator.toolkit.feature</module>
+ <module>org.simantics.simulator.toolkit.db.feature</module>
<module>org.simantics.spreadsheet.feature</module>
<module>org.simantics.spreadsheet.ui.feature</module>
<module>org.simantics.structural.feature</module>
<li class="toclevel-1 tocsection-11"><a href="#LGPL"><span class="tocnumber">11</span> <span class="toctext">LGPL</span></a></li>\r
<li class="toclevel-1 tocsection-12"><a href="#MIT_License"><span class="tocnumber">12</span> <span class="toctext">MIT License</span></a></li>\r
<li class="toclevel-1 tocsection-13"><a href="#Mozilla_Public_License_2.0"><span class="tocnumber">13</span> <span class="toctext">Mozilla Public License 2.0</span></a></li>\r
+<li class="toclevel-1 tocsection-14"><a href="#HDF5"><span class="tocnumber">14</span> <span class="toctext">HDF5 (Hierarchical Data Format 5) License</span></a></li>\r
</ul>\r
</div>\r
\r
<td> <a href="#CDDL_1.1_.2B_GPLv2_with_classpath_exception">CDDL 1.1 + GPLv2 with classpath exception</a>\r
</td>\r
<td> 2.0.1\r
-</td></tr></tbody></table>\r
+</td></tr>\r
+<tr>\r
+<td valign="top"> <a rel="nofollow" class="external text" href="https://www.hdfgroup.org/solutions/hdf5/">HDF5 Software Library</a>\r
+</td>\r
+<td> <a href="#HDF5">HDF5 License (BSD-style Open Source)</a>\r
+</td>\r
+<td>1.10.0-patch1\r
+</td></tr>\r
+</tbody></table>\r
</blockquote>\r
\r
<!-- LICENSE TEXTS BEGIN -->\r
defined by the Mozilla Public License, v. 2.0.\r
</pre>\r
\r
+<h2><span class="mw-headline" id="HDF5">HDF5 (Hierarchical Data Format 5) Software Library and Utilities License</span></h2>\r
+<ul>\r
+<li> <a rel="nofollow" class="external free" href="https://support.hdfgroup.org/ftp/HDF5/releases/COPYING">https://support.hdfgroup.org/ftp/HDF5/releases/COPYING</a>\r
+</li>\r
+</ul>\r
+<pre>Copyright Notice and License Terms for \r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 2006-2016 by The HDF Group.\r
+\r
+NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities\r
+Copyright 1998-2006 by the Board of Trustees of the University of Illinois.\r
+\r
+All rights reserved.\r
+\r
+Redistribution and use in source and binary forms, with or without \r
+modification, are permitted for any purpose (including commercial purposes) \r
+provided that the following conditions are met:\r
+\r
+1. Redistributions of source code must retain the above copyright notice, \r
+ this list of conditions, and the following disclaimer.\r
+\r
+2. Redistributions in binary form must reproduce the above copyright notice, \r
+ this list of conditions, and the following disclaimer in the documentation \r
+ and/or materials provided with the distribution.\r
+\r
+3. In addition, redistributions of modified forms of the source or binary \r
+ code must carry prominent notices stating that the original code was \r
+ changed and the date of the change.\r
+\r
+4. All publications or advertising materials mentioning features or use of \r
+ this software are asked, but not required, to acknowledge that it was \r
+ developed by The HDF Group and by the National Center for Supercomputing \r
+ Applications at the University of Illinois at Urbana-Champaign and \r
+ credit the contributors.\r
+\r
+5. Neither the name of The HDF Group, the name of the University, nor the \r
+ name of any Contributor may be used to endorse or promote products derived \r
+ from this software without specific prior written permission from \r
+ The HDF Group, the University, or the Contributor, respectively.\r
+\r
+DISCLAIMER: \r
+THIS SOFTWARE IS PROVIDED BY THE HDF GROUP AND THE CONTRIBUTORS \r
+"AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED. In no \r
+event shall The HDF Group or the Contributors be liable for any damages \r
+suffered by the users arising out of the use of this software, even if \r
+advised of the possibility of such damage. \r
+\r
+-----------------------------------------------------------------------------\r
+-----------------------------------------------------------------------------\r
+\r
+Contributors: National Center for Supercomputing Applications (NCSA) at \r
+the University of Illinois, Fortner Software, Unidata Program Center (netCDF), \r
+The Independent JPEG Group (JPEG), Jean-loup Gailly and Mark Adler (gzip), \r
+and Digital Equipment Corporation (DEC).\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the Lawrence Berkeley \r
+National Laboratory (LBNL) and the United States Department of Energy \r
+under Prime Contract No. DE-AC02-05CH11231.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+Portions of HDF5 were developed with support from the University of \r
+California, Lawrence Livermore National Laboratory (UC LLNL). \r
+The following statement applies to those portions of the product and must \r
+be retained in any redistribution of source code, binaries, documentation, \r
+and/or accompanying materials:\r
+\r
+ This work was partially produced at the University of California, \r
+ Lawrence Livermore National Laboratory (UC LLNL) under contract \r
+ no. W-7405-ENG-48 (Contract 48) between the U.S. Department of Energy \r
+ (DOE) and The Regents of the University of California (University) \r
+ for the operation of UC LLNL.\r
+\r
+ DISCLAIMER: \r
+ This work was prepared as an account of work sponsored by an agency of \r
+ the United States Government. Neither the United States Government nor \r
+ the University of California nor any of their employees, makes any \r
+ warranty, express or implied, or assumes any liability or responsibility \r
+ for the accuracy, completeness, or usefulness of any information, \r
+ apparatus, product, or process disclosed, or represents that its use \r
+ would not infringe privately- owned rights. Reference herein to any \r
+ specific commercial products, process, or service by trade name, \r
+ trademark, manufacturer, or otherwise, does not necessarily constitute \r
+ or imply its endorsement, recommendation, or favoring by the United \r
+ States Government or the University of California. The views and \r
+ opinions of authors expressed herein do not necessarily state or reflect \r
+ those of the United States Government or the University of California, \r
+ and shall not be used for advertising or product endorsement purposes.\r
+\r
+-----------------------------------------------------------------------------\r
+\r
+HDF5 is available with the SZIP compression library but SZIP is not part \r
+of HDF5 and has separate copyright and license terms. See "Szip Compression \r
+in HDF Products" (www.hdfgroup.org/doc_resource/SZIP/) for further details.\r
+\r
+-----------------------------------------------------------------------------\r
+</pre>\r
+\r
</div>\r
</div>\r
</body>\r
<modules>
<module>org.simantics.sdk.build.targetdefinition</module>
- <module>org.simantics.sdk.repository</module>
- <module>org.simantics.desktop.rcp.product</module>
</modules>
+
+ <profiles>
+ <!-- These profiles use negated-value property activation: "!false" matches
+ when the property is unset or set to anything other than "false", so each
+ profile is active by default and can be skipped explicitly with
+ -Dbuild-p2-repository=false or -Dbuild-products=false. -->
+ <profile>
+ <id>build-p2-repository</id>
+ <activation>
+ <property>
+ <name>build-p2-repository</name>
+ <value>!false</value>
+ </property>
+ </activation>
+ <modules>
+ <module>org.simantics.sdk.repository</module>
+ </modules>
+ </profile>
+ <profile>
+ <id>build-products</id>
+ <activation>
+ <property>
+ <name>build-products</name>
+ <value>!false</value>
+ </property>
+ </activation>
+ <modules>
+ <module>org.simantics.desktop.rcp.product</module>
+ </modules>
+ </profile>
+ </profiles>
+
</project>