Abstract:
Ever since the invention of the first computers, researchers have continuously investigated ways to achieve better system performance. These efforts have led to outstanding hardware and software advances that have produced much faster and much smaller machines. Meanwhile, programming approaches and algorithms have evolved at the same pace in order to make optimal use of the high-performance capacities of contemporary systems. These solutions generally conceal complexity by building higher-level abstraction layers on top of the native environments of popular programming languages. However, their higher levels of abstraction expose them to risks such as incompatibility, verbosity and loss of programmer control over parallelization details. In addition, the new concepts and syntax introduced by such frameworks make parallel programs more complex and confusing to implement. This research investigates a Java-based parallel programming environment called Annotation Parallel Task (@PT) that uses native Java annotations for its application programming interface (API). @PT employs these annotations to collect metadata from native code without adding a new abstraction level on top of Java. Instead of introducing new syntax and grammar to the conventional Java language, the framework exploits techniques for inferring parallelization semantics from simple sequential code. These inference concepts and the underlying parallel processing runtime of @PT work together to convert annotated sequential code into parallelized versions that execute efficiently. Following its main objective, the work examines concepts for combining different high-performance techniques of shared-memory parallel computing and distributed-memory parallel computing, namely cloud environments, under a unified set of concepts.
The result is an unobtrusive approach that seamlessly integrates various parallel-programming mechanisms within the simplified structure of native object-oriented Java programs.
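The general idea of annotation-driven parallelization described above can be illustrated with a minimal sketch. The annotation name (`@Task`), the runtime helper, and the method names below are all hypothetical, chosen for illustration only; they are not @PT's actual API. The sketch marks a plain sequential method with a runtime-retained annotation, discovers it via reflection, and runs two halves of the computation concurrently on an executor:

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;
import java.util.concurrent.*;

// Hypothetical marker annotation (illustrative only, not @PT's real API).
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Task {}

public class AnnotationParallelSketch {

    // Ordinary sequential code; the annotation carries the metadata.
    @Task
    public static long sumRange(long from, long to) {
        long s = 0;
        for (long i = from; i < to; i++) s += i;
        return s;
    }

    // A toy "runtime": verify the method is marked @Task, then split
    // its input range across two worker threads.
    public static long parallelSum(long n) throws Exception {
        Method m = AnnotationParallelSketch.class
                .getMethod("sumRange", long.class, long.class);
        if (!m.isAnnotationPresent(Task.class))
            throw new IllegalStateException("method not marked @Task");

        ExecutorService pool = Executors.newFixedThreadPool(2);
        Future<Long> lo = pool.submit(() -> (Long) m.invoke(null, 0L, n / 2));
        Future<Long> hi = pool.submit(() -> (Long) m.invoke(null, n / 2, n));
        long total = lo.get() + hi.get();
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        // Sum of 0..999999; the caller never deals with threads directly.
        System.out.println(parallelSum(1_000_000L));
    }
}
```

The point of the sketch is the division of labour the abstract describes: the annotated method remains valid sequential Java, while a separate runtime inspects the metadata and decides how to parallelize the call.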