Learning Hibernate: HQL Queries

This article introduces Hibernate's HQL query language, covering query conditions, projections, pagination, joins, and grouping. It walks through examples of performing database operations with HQL and also covers Oracle database configuration and remote-access setup.


Original article: https://github.com/mageSFC/myblog/blob/master/Hibernate%E5%AD%A6%E4%B9%A0%E4%B9%8BHQL%E6%9F%A5%E8%AF%A2.md

Author's blog (马哥私房菜): https://github.com/mageSFC/myblog

HQL (Hibernate Query Language) is an object-oriented query language that resembles SQL.
Of the retrieval methods Hibernate provides, HQL is the most widely used. It offers the following capabilities (a short usage sketch follows the list):
1. Setting various query conditions in the query statement
2. Projection queries, i.e. retrieving only part of an object's properties
3. Pagination
4. Join queries
5. Grouping queries, with the group by and having keywords
6. Built-in aggregate functions such as sum(), min(), and max()
7. Subqueries
8. Dynamic parameter binding
9. Calling user-defined SQL functions or standard SQL functions
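
To make the list concrete, here is a minimal sketch of these features in Hibernate code. It is not taken from the original article: it assumes an Employee entity mapped to the EMPLOYEES sample table shown later (a possible mapping and a HibernateUtil helper are sketched further below) and the Hibernate 5.2+ API.

```java
// Minimal sketch, not from the original article. Assumes an Employee entity mapped to
// the EMPLOYEES sample table shown later and a HibernateUtil helper (sketched further below).
import org.hibernate.Session;
import org.hibernate.SessionFactory;

import java.math.BigDecimal;
import java.util.List;

public class HqlQueryDemo {
    public static void main(String[] args) {
        SessionFactory sessionFactory = HibernateUtil.getSessionFactory();
        try (Session session = sessionFactory.openSession()) {
            // 1) Query conditions + dynamic parameter binding
            List<Employee> itStaff = session
                    .createQuery("from Employee e where e.jobId = :job and e.salary > :minSalary", Employee.class)
                    .setParameter("job", "IT_PROG")
                    .setParameter("minSalary", new BigDecimal("4000"))
                    .list();

            // 2) Projection: retrieve only some properties
            List<Object[]> names = session
                    .createQuery("select e.firstName, e.lastName from Employee e", Object[].class)
                    .list();

            // 3) Pagination: the first 10 rows
            List<Employee> firstPage = session
                    .createQuery("from Employee e order by e.employeeId", Employee.class)
                    .setFirstResult(0)
                    .setMaxResults(10)
                    .list();

            // 4) Grouping + built-in aggregate function
            List<Object[]> avgSalaryByDept = session
                    .createQuery("select e.departmentId, avg(e.salary) from Employee e "
                            + "group by e.departmentId having avg(e.salary) > 5000", Object[].class)
                    .list();

            System.out.println(itStaff.size() + " IT programmers, " + names.size() + " names, "
                    + firstPage.size() + " rows on the first page, " + avgSalaryByDept.size() + " departments");
        }
    }
}
```

Each query above corresponds to one of the numbered capabilities: conditions with parameter binding, projection, pagination, and grouping with an aggregate.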

The examples below use an Oracle database.

The Oracle database is installed on a Windows machine.

Enabling remote access to Oracle:

The fix:

Check the port status: open CMD and run netstat -a -n.
The output shows that port 1521 is bound to the local address 127.0.0.1:1521.
In that case, edit listener.ora under the Oracle installation directory (dbhome_1\NETWORK\ADMIN\listener.ora, or the corresponding instant client file, instantclient_11_2\listener.ora) and change the HOST value to 0.0.0.0 (a sample excerpt follows these steps).
Restart the Oracle listener service.
Run netstat again; port 1521 now shows the local address 0.0.0.0:1521.
At this point the port can be reached with telnet via 127.0.0.1, localhost, the host name, or the machine's IP address.
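
For illustration only, the relevant listener.ora entry after the change might look like the excerpt below; the listener name and overall structure are the Oracle defaults, not values quoted from the original article.

```
LISTENER =
  (DESCRIPTION_LIST =
    (DESCRIPTION =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 0.0.0.0)(PORT = 1521))
    )
  )
```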

After installing Oracle on Windows, open port 1521 (Oracle's default listener port) in the Windows firewall. If clients still cannot connect, one further setting is needed: under the registry key HKEY_LOCAL_MACHINE\Software\ORACLE\HOME, add a value named USE_SHARED_SOCKET, set it to TRUE, and then restart the Oracle service and the listener service.


Summary:
If telnet to Oracle's port 1521 fails, check the following:
1. Whether the firewall is enabled and, if so, whether port 1521 has been opened;
2. The HOST value in listener.ora.

Note: from Oracle 10.2 onward, USE_SHARED_SOCKET already defaults to TRUE, so it no longer needs to be changed.

Finally, nmap shows port 1521 in the open state, which confirms that the listener is reachable:

$ nmap 10.0.63.42                                                                                                                           

Starting Nmap 7.01 ( https://nmap.org ) at 2017-12-22 16:01 CST
Nmap scan report for 10.0.63.42
Host is up (0.88s latency).
Not shown: 987 closed ports
PORT      STATE SERVICE
1521/tcp  open  oracle
Nmap done: 1 IP address (1 host up) scanned in 5.13 seconds

Adding the Oracle JDBC driver

Put the Oracle JDBC driver jar downloaded from Oracle's website in your home directory and run the following command to install it into the local Maven repository:
$ mvn install:install-file -Dfile=ojdbc8.jar -DgroupId=com.oracle -DartifactId=ojdbc8 -Dversion=12 -Dpackaging=jar
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Maven Stub Project (No POM) 1
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install-file (default-cli) @ standalone-pom ---
[INFO] pom.xml not found in ojdbc8.jar
[INFO] Installing /home/mamh/ojdbc8.jar to /home/mamh/.m2/repository/com/oracle/ojdbc8/12/ojdbc8-12.jar
[INFO] Installing /tmp/mvninstall5920371845117601504.pom to /home/mamh/.m2/repository/com/oracle/ojdbc8/12/ojdbc8-12.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.332 s
[INFO] Finished at: 2017-12-22T15:54:02+08:00
[INFO] Final Memory: 8M/303M
[INFO] ------------------------------------------------------------------------




$ cat /home/mamh/.m2/repository/com/oracle/ojdbc8/12/ojdbc8-12.pom                                                                          
<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc8</artifactId>
  <version>12</version>
  <description>POM was created from install:install-file</description>
</project>

Finally, add the following dependency to pom.xml:
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc8</artifactId>
    <version>12</version>
</dependency>
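
The article's own Hibernate configuration is not shown in this section. Purely as an illustration of wiring Hibernate to Oracle with this driver, a programmatic configuration might look like the sketch below; the connection URL, credentials, and the Employee mapping are placeholder assumptions, not values from the original article.

```java
// Illustrative sketch only -- connection details are placeholders, not values from the article.
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateUtil {
    private static final SessionFactory SESSION_FACTORY = buildSessionFactory();

    private static SessionFactory buildSessionFactory() {
        Configuration cfg = new Configuration();
        cfg.setProperty("hibernate.connection.driver_class", "oracle.jdbc.OracleDriver");
        // Placeholder host and service name; adjust to the Windows machine running Oracle.
        cfg.setProperty("hibernate.connection.url", "jdbc:oracle:thin:@//192.168.1.100:1521/orcl");
        cfg.setProperty("hibernate.connection.username", "hr");   // placeholder
        cfg.setProperty("hibernate.connection.password", "hr");   // placeholder
        cfg.setProperty("hibernate.dialect", "org.hibernate.dialect.Oracle12cDialect");
        cfg.setProperty("hibernate.show_sql", "true");
        cfg.addAnnotatedClass(Employee.class); // entity sketched after the sample tables below
        return cfg.buildSessionFactory();
    }

    public static SessionFactory getSessionFactory() {
        return SESSION_FACTORY;
    }
}
```

An equivalent hibernate.cfg.xml works just as well; the programmatic form is used here only to keep the sketch self-contained.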



Below are the test tables and data we use; the SQL scripts that populate and create them follow the tables.

EMPLOYEE table:

| # | EMPLOYEE_ID | FIRST_NAME | LAST_NAME | EMAIL | PHONE_NUMBER | HIRE_DATE | JOB_ID | SALARY | COMMISSION_PCT | MANAGER_ID | DEPARTMENT_ID |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 100 | Steven | King | SKING | 515.123.4567 | 1987-06-17 00:00:00 | AD_PRES | 24000.00 | NULL | NULL | 90 |
| 2 | 101 | Neena | Kochhar | NKOCHHAR | 515.123.4568 | 1989-09-21 00:00:00 | AD_VP | 17000.00 | NULL | 100 | 90 |
| 3 | 102 | Lex | De Haan | LDEHAAN | 515.123.4569 | 1993-01-13 00:00:00 | AD_VP | 17000.00 | NULL | 100 | 90 |
| 4 | 103 | Alexander | Hunold | AHUNOLD | 590.423.4567 | 1990-01-03 00:00:00 | IT_PROG | 9000.00 | NULL | 102 | 60 |
| 5 | 104 | Bruce | Ernst | BERNST | 590.423.4568 | 1991-05-21 00:00:00 | IT_PROG | 6000.00 | NULL | 103 | 60 |
| 6 | 105 | David | Austin | DAUSTIN | 590.423.4569 | 1997-06-25 00:00:00 | IT_PROG | 4800.00 | NULL | 103 | 60 |
| 7 | 106 | Valli | Pataballa | VPATABAL | 590.423.4560 | 1998-02-05 00:00:00 | IT_PROG | 4800.00 | NULL | 103 | 60 |
| 8 | 107 | Diana | Lorentz | DLORENTZ | 590.423.5567 | 1999-02-07 00:00:00 | IT_PROG | 4200.00 | NULL | 103 | 60 |
| 9 | 108 | Nancy | Greenberg | NGREENBE | 515.124.4569 | 1994-08-17 00:00:00 | FI_MGR | 12000.00 | NULL | 101 | 100 |
| 10 | 109 | Daniel | Faviet | DFAVIET | 515.124.4169 | 1994-08-16 00:00:00 | FI_ACCOUNT | 9000.00 | NULL | 108 | 100 |
| 11 | 110 | John | Chen | JCHEN | 515.124.4269 | 1997-09-28 00:00:00 | FI_ACCOUNT | 8200.00 | NULL | 108 | 100 |
| 12 | 111 | Ismael | Sciarra | ISCIARRA | 515.124.4369 | 1997-09-30 00:00:00 | FI_ACCOUNT | 7700.00 | NULL | 108 | 100 |
| 13 | 112 | Jose Manuel | Urman | JMURMAN | 515.124.4469 | 1998-03-07 00:00:00 | FI_ACCOUNT | 7800.00 | NULL | 108 | 100 |
| 14 | 113 | Luis | Popp | LPOPP | 515.124.4567 | 1999-12-07 00:00:00 | FI_ACCOUNT | 6900.00 | NULL | 108 | 100 |
| 15 | 114 | Den | Raphaely | DRAPHEAL | 515.127.4561 | 1994-12-07 00:00:00 | PU_MAN | 11000.00 | NULL | 100 | 30 |
| 16 | 115 | Alexander | Khoo | AKHOO | 515.127.4562 | 1995-05-18 00:00:00 | PU_CLERK | 3100.00 | NULL | 114 | 30 |
| 17 | 116 | Shelli | Baida | SBAIDA | 515.127.4563 | 1997-12-24 00:00:00 | PU_CLERK | 2900.00 | NULL | 114 | 30 |
| 18 | 117 | Sigal | Tobias | STOBIAS | 515.127.4564 | 1997-07-24 00:00:00 | PU_CLERK | 2800.00 | NULL | 114 | 30 |
| 19 | 118 | Guy | Himuro | GHIMURO | 515.127.4565 | 1998-11-15 00:00:00 | PU_CLERK | 2600.00 | NULL | 114 | 30 |
| 20 | 119 | Karen | Colmenares | KCOLMENA | 515.127.4566 | 1999-08-10 00:00:00 | PU_CLERK | 2500.00 | NULL | 114 | 30 |
| 21 | 120 | Matthew | Weiss | MWEISS | 650.123.1234 | 1996-07-18 00:00:00 | ST_MAN | 8000.00 | NULL | 100 | 50 |
| 22 | 121 | Adam | Fripp | AFRIPP | 650.123.2234 | 1997-04-10 00:00:00 | ST_MAN | 8200.00 | NULL | 100 | 50 |
| 23 | 122 | Payam | Kaufling | PKAUFLIN | 650.123.3234 | 1995-05-01 00:00:00 | ST_MAN | 7900.00 | NULL | 100 | 50 |
| 24 | 123 | Shanta | Vollman | SVOLLMAN | 650.123.4234 | 1997-10-10 00:00:00 | ST_MAN | 6500.00 | NULL | 100 | 50 |
| 25 | 124 | Kevin | Mourgos | KMOURGOS | 650.123.5234 | 1999-11-16 00:00:00 | ST_MAN | 5800.00 | NULL | 100 | 50 |
| 26 | 125 | Julia | Nayer | JNAYER | 650.124.1214 | 1997-07-16 00:00:00 | ST_CLERK | 3200.00 | NULL | 120 | 50 |
| 27 | 126 | Irene | Mikkilineni | IMIKKILI | 650.124.1224 | 1998-09-28 00:00:00 | ST_CLERK | 2700.00 | NULL | 120 | 50 |
| 28 | 127 | James | Landry | JLANDRY | 650.124.1334 | 1999-01-14 00:00:00 | ST_CLERK | 2400.00 | NULL | 120 | 50 |
| 29 | 128 | Steven | Markle | SMARKLE | 650.124.1434 | 2000-03-08 00:00:00 | ST_CLERK | 2200.00 | NULL | 120 | 50 |
| 30 | 129 | Laura | Bissot | LBISSOT | 650.124.5234 | 1997-08-20 00:00:00 | ST_CLERK | 3300.00 | NULL | 121 | 50 |
| 31 | 130 | Mozhe | Atkinson | MATKINSO | 650.124.6234 | 1997-10-30 00:00:00 | ST_CLERK | 2800.00 | NULL | 121 | 50 |
| 32 | 131 | James | Marlow | JAMRLOW | 650.124.7234 | 1997-02-16 00:00:00 | ST_CLERK | 2500.00 | NULL | 121 | 50 |
| 33 | 132 | TJ | Olson | TJOLSON | 650.124.8234 | 1999-04-10 00:00:00 | ST_CLERK | 2100.00 | NULL | 121 | 50 |
| 34 | 133 | Jason | Mallin | JMALLIN | 650.127.1934 | 1996-06-14 00:00:00 | ST_CLERK | 3300.00 | NULL | 122 | 50 |
| 35 | 134 | Michael | Rogers | MROGERS | 650.127.1834 | 1998-08-26 00:00:00 | ST_CLERK | 2900.00 | NULL | 122 | 50 |
| 36 | 135 | Ki | Gee | KGEE | 650.127.1734 | 1999-12-12 00:00:00 | ST_CLERK | 2400.00 | NULL | 122 | 50 |
| 37 | 136 | Hazel | Philtanker | HPHILTAN | 650.127.1634 | 2000-02-06 00:00:00 | ST_CLERK | 2200.00 | NULL | 122 | 50 |
| 38 | 137 | Renske | Ladwig | RLADWIG | 650.121.1234 | 1995-07-14 00:00:00 | ST_CLERK | 3600.00 | NULL | 123 | 50 |
| 39 | 138 | Stephen | Stiles | SSTILES | 650.121.2034 | 1997-10-26 00:00:00 | ST_CLERK | 3200.00 | NULL | 123 | 50 |
| 40 | 139 | John | Seo | JSEO | 650.121.2019 | 1998-02-12 00:00:00 | ST_CLERK | 2700.00 | NULL | 123 | 50 |
| 41 | 140 | Joshua | Patel | JPATEL | 650.121.1834 | 1998-04-06 00:00:00 | ST_CLERK | 2500.00 | NULL | 123 | 50 |
| 42 | 141 | Trenna | Rajs | TRAJS | 650.121.8009 | 1995-10-17 00:00:00 | ST_CLERK | 3500.00 | NULL | 124 | 50 |
| 43 | 142 | Curtis | Davies | CDAVIES | 650.121.2994 | 1997-01-29 00:00:00 | ST_CLERK | 3100.00 | NULL | 124 | 50 |
| 44 | 143 | Randall | Matos | RMATOS | 650.121.2874 | 1998-03-15 00:00:00 | ST_CLERK | 2600.00 | NULL | 124 | 50 |
| 45 | 144 | Peter | Vargas | PVARGAS | 650.121.2004 | 1998-07-09 00:00:00 | ST_CLERK | 2500.00 | NULL | 124 | 50 |
| 46 | 145 | John | Russell | JRUSSEL | 011.44.1344.429268 | 1996-10-01 00:00:00 | SA_MAN | 14000.00 | 0.40 | 100 | 80 |
| 47 | 146 | Karen | Partners | KPARTNER | 011.44.1344.467268 | 1997-01-05 00:00:00 | SA_MAN | 13500.00 | 0.30 | 100 | 80 |
| 48 | 147 | Alberto | Errazuriz | AERRAZUR | 011.44.1344.429278 | 1997-03-10 00:00:00 | SA_MAN | 12000.00 | 0.30 | 100 | 80 |
| 49 | 148 | Gerald | Cambrault | GCAMBRAU | 011.44.1344.619268 | 1999-10-15 00:00:00 | SA_MAN | 11000.00 | 0.30 | 100 | 80 |
| 50 | 149 | Eleni | Zlotkey | EZLOTKEY | 011.44.1344.429018 | 2000-01-29 00:00:00 | SA_MAN | 10500.00 | 0.20 | 100 | 80 |
| 51 | 150 | Peter | Tucker | PTUCKER | 011.44.1344.129268 | 1997-01-30 00:00:00 | SA_REP | 10000.00 | 0.30 | 145 | 80 |
| 52 | 151 | David | Bernstein | DBERNSTE | 011.44.1344.345268 | 1997-03-24 00:00:00 | SA_REP | 9500.00 | 0.25 | 145 | 80 |
| 53 | 152 | Peter | Hall | PHALL | 011.44.1344.478968 | 1997-08-20 00:00:00 | SA_REP | 9000.00 | 0.25 | 145 | 80 |
| 54 | 153 | Christopher | Olsen | COLSEN | 011.44.1344.498718 | 1998-03-30 00:00:00 | SA_REP | 8000.00 | 0.20 | 145 | 80 |
| 55 | 154 | Nanette | Cambrault | NCAMBRAU | 011.44.1344.987668 | 1998-12-09 00:00:00 | SA_REP | 7500.00 | 0.20 | 145 | 80 |
| 56 | 155 | Oliver | Tuvault | OTUVAULT | 011.44.1344.486508 | 1999-11-23 00:00:00 | SA_REP | 7000.00 | 0.15 | 145 | 80 |
| 57 | 156 | Janette | King | JKING | 011.44.1345.429268 | 1996-01-30 00:00:00 | SA_REP | 10000.00 | 0.35 | 146 | 80 |
| 58 | 157 | Patrick | Sully | PSULLY | 011.44.1345.929268 | 1996-03-04 00:00:00 | SA_REP | 9500.00 | 0.35 | 146 | 80 |
| 59 | 158 | Allan | McEwen | AMCEWEN | 011.44.1345.829268 | 1996-08-01 00:00:00 | SA_REP | 9000.00 | 0.35 | 146 | 80 |
| 60 | 159 | Lindsey | Smith | LSMITH | 011.44.1345.729268 | 1997-03-10 00:00:00 | SA_REP | 8000.00 | 0.30 | 146 | 80 |
| 61 | 160 | Louise | Doran | LDORAN | 011.44.1345.629268 | 1997-12-15 00:00:00 | SA_REP | 7500.00 | 0.30 | 146 | 80 |
| 62 | 161 | Sarath | Sewall | SSEWALL | 011.44.1345.529268 | 1998-11-03 00:00:00 | SA_REP | 7000.00 | 0.25 | 146 | 80 |
| 63 | 162 | Clara | Vishney | CVISHNEY | 011.44.1346.129268 | 1997-11-11 00:00:00 | SA_REP | 10500.00 | 0.25 | 147 | 80 |
| 64 | 163 | Danielle | Greene | DGREENE | 011.44.1346.229268 | 1999-03-19 00:00:00 | SA_REP | 9500.00 | 0.15 | 147 | 80 |
| 65 | 164 | Mattea | Marvins | MMARVINS | 011.44.1346.329268 | 2000-01-24 00:00:00 | SA_REP | 7200.00 | 0.10 | 147 | 80 |
| 66 | 165 | David | Lee | DLEE | 011.44.1346.529268 | 2000-02-23 00:00:00 | SA_REP | 6800.00 | 0.10 | 147 | 80 |
| 67 | 166 | Sundar | Ande | SANDE | 011.44.1346.629268 | 2000-03-24 00:00:00 | SA_REP | 6400.00 | 0.10 | 147 | 80 |
| 68 | 167 | Amit | Banda | ABANDA | 011.44.1346.729268 | 2000-04-21 00:00:00 | SA_REP | 6200.00 | 0.10 | 147 | 80 |
| 69 | 168 | Lisa | Ozer | LOZER | 011.44.1343.929268 | 1997-03-11 00:00:00 | SA_REP | 11500.00 | 0.25 | 148 | 80 |
| 70 | 169 | Harrison | Bloom | HBLOOM | 011.44.1343.829268 | 1998-03-23 00:00:00 | SA_REP | 10000.00 | 0.20 | 148 | 80 |
| 71 | 170 | Tayler | Fox | TFOX | 011.44.1343.729268 | 1998-01-24 00:00:00 | SA_REP | 9600.00 | 0.20 | 148 | 80 |
| 72 | 171 | William | Smith | WSMITH | 011.44.1343.629268 | 1999-02-23 00:00:00 | SA_REP | 7400.00 | 0.15 | 148 | 80 |
| 73 | 172 | Elizabeth | Bates | EBATES | 011.44.1343.529268 | 1999-03-24 00:00:00 | SA_REP | 7300.00 | 0.15 | 148 | 80 |
| 74 | 173 | Sundita | Kumar | SKUMAR | 011.44.1343.329268 | 2000-04-21 00:00:00 | SA_REP | 6100.00 | 0.10 | 148 | 80 |
| 75 | 174 | Ellen | Abel | EABEL | 011.44.1644.429267 | 1996-05-11 00:00:00 | SA_REP | 11000.00 | 0.30 | 149 | 80 |
| 76 | 175 | Alyssa | Hutton | AHUTTON | 011.44.1644.429266 | 1997-03-19 00:00:00 | SA_REP | 8800.00 | 0.25 | 149 | 80 |
| 77 | 176 | Jonathon | Taylor | JTAYLOR | 011.44.1644.429265 | 1998-03-24 00:00:00 | SA_REP | 8600.00 | 0.20 | 149 | 80 |
| 78 | 177 | Jack | Livingston | JLIVINGS | 011.44.1644.429264 | 1998-04-23 00:00:00 | SA_REP | 8400.00 | 0.20 | 149 | 80 |
| 79 | 178 | Kimberely | Grant | KGRANT | 011.44.1644.429263 | 1999-05-24 00:00:00 | SA_REP | 7000.00 | 0.15 | 149 | NULL |
| 80 | 179 | Charles | Johnson | CJOHNSON | 011.44.1644.429262 | 2000-01-04 00:00:00 | SA_REP | 6200.00 | 0.10 | 149 | 80 |
| 81 | 180 | Winston | Taylor | WTAYLOR | 650.507.9876 | 1998-01-24 00:00:00 | SH_CLERK | 3200.00 | NULL | 120 | 50 |
| 82 | 181 | Jean | Fleaur | JFLEAUR | 650.507.9877 | 1998-02-23 00:00:00 | SH_CLERK | 3100.00 | NULL | 120 | 50 |
| 83 | 182 | Martha | Sullivan | MSULLIVA | 650.507.9878 | 1999-06-21 00:00:00 | SH_CLERK | 2500.00 | NULL | 120 | 50 |
| 84 | 183 | Girard | Geoni | GGEONI | 650.507.9879 | 2000-02-03 00:00:00 | SH_CLERK | 2800.00 | NULL | 120 | 50 |
| 85 | 184 | Nandita | Sarchand | NSARCHAN | 650.509.1876 | 1996-01-27 00:00:00 | SH_CLERK | 4200.00 | NULL | 121 | 50 |
| 86 | 185 | Alexis | Bull | ABULL | 650.509.2876 | 1997-02-20 00:00:00 | SH_CLERK | 4100.00 | NULL | 121 | 50 |
| 87 | 186 | Julia | Dellinger | JDELLING | 650.509.3876 | 1998-06-24 00:00:00 | SH_CLERK | 3400.00 | NULL | 121 | 50 |
| 88 | 187 | Anthony | Cabrio | ACABRIO | 650.509.4876 | 1999-02-07 00:00:00 | SH_CLERK | 3000.00 | NULL | 121 | 50 |
| 89 | 188 | Kelly | Chung | KCHUNG | 650.505.1876 | 1997-06-14 00:00:00 | SH_CLERK | 3800.00 | NULL | 122 | 50 |
| 90 | 189 | Jennifer | Dilly | JDILLY | 650.505.2876 | 1997-08-13 00:00:00 | SH_CLERK | 3600.00 | NULL | 122 | 50 |
| 91 | 190 | Timothy | Gates | TGATES | 650.505.3876 | 1998-07-11 00:00:00 | SH_CLERK | 2900.00 | NULL | 122 | 50 |
| 92 | 191 | Randall | Perkins | RPERKINS | 650.505.4876 | 1999-12-19 00:00:00 | SH_CLERK | 2500.00 | NULL | 122 | 50 |
| 93 | 192 | Sarah | Bell | SBELL | 650.501.1876 | 1996-02-04 00:00:00 | SH_CLERK | 4000.00 | NULL | 123 | 50 |
| 94 | 193 | Britney | Everett | BEVERETT | 650.501.2876 | 1997-03-03 00:00:00 | SH_CLERK | 3900.00 | NULL | 123 | 50 |
| 95 | 194 | Samuel | McCain | SMCCAIN | 650.501.3876 | 1998-07-01 00:00:00 | SH_CLERK | 3200.00 | NULL | 123 | 50 |
| 96 | 195 | Vance | Jones | VJONES | 650.501.4876 | 1999-03-17 00:00:00 | SH_CLERK | 2800.00 | NULL | 123 | 50 |
| 97 | 196 | Alana | Walsh | AWALSH | 650.507.9811 | 1998-04-24 00:00:00 | SH_CLERK | 3100.00 | NULL | 124 | 50 |
| 98 | 197 | Kevin | Feeney | KFEENEY | 650.507.9822 | 1998-05-23 00:00:00 | SH_CLERK | 3000.00 | NULL | 124 | 50 |
| 99 | 198 | Donald | OConnell | DOCONNEL | 650.507.9833 | 1999-06-21 00:00:00 | SH_CLERK | 2600.00 | NULL | 124 | 50 |
| 100 | 199 | Douglas | Grant | DGRANT | 650.507.9844 | 2000-01-13 00:00:00 | SH_CLERK | 2600.00 | NULL | 124 | 50 |
| 101 | 200 | Jennifer | Whalen | JWHALEN | 515.123.4444 | 1987-09-17 00:00:00 | AD_ASST | 4400.00 | NULL | 101 | 10 |
| 102 | 201 | Michael | Hartstein | MHARTSTE | 515.123.5555 | 1996-02-17 00:00:00 | MK_MAN | 13000.00 | NULL | 100 | 20 |
| 103 | 202 | Pat | Fay | PFAY | 603.123.6666 | 1997-08-17 00:00:00 | MK_REP | 6000.00 | NULL | 201 | 20 |
| 104 | 203 | Susan | Mavris | SMAVRIS | 515.123.7777 | 1994-06-07 00:00:00 | HR_REP | 6500.00 | NULL | 101 | 40 |
| 105 | 204 | Hermann | Baer | HBAER | 515.123.8888 | 1994-06-07 00:00:00 | PR_REP | 10000.00 | NULL | 101 | 70 |
| 106 | 205 | Shelley | Higgins | SHIGGINS | 515.123.8080 | 1994-06-07 00:00:00 | AC_MGR | 12000.00 | NULL | 101 | 110 |
| 107 | 206 | William | Gietz | WGIETZ | 515.123.8181 | 1994-06-07 00:00:00 | AC_ACCOUNT | 8300.00 | NULL | 205 | 110 |



DEPARTMENT table:

| # | DEPARTMENT_ID | DEPARTMENT_NAME | MANAGER_ID | LOCATION_ID |
|---|---|---|---|---|
| 1 | 10 | Administration | 200 | 1700 |
| 2 | 20 | Marketing | 201 | 1800 |
| 3 | 30 | Purchasing | 114 | 1700 |
| 4 | 40 | Human Resources | 203 | 2400 |
| 5 | 50 | Shipping | 121 | 1500 |
| 6 | 60 | IT | 103 | 1400 |
| 7 | 70 | Public Relations | 204 | 2700 |
| 8 | 80 | Sales | 145 | 2500 |
| 9 | 90 | Executive | 100 | 1700 |
| 10 | 100 | Finance | 108 | 1700 |
| 11 | 110 | Accounting | 205 | 1700 |
| 12 | 120 | Treasury | NULL | 1700 |
| 13 | 130 | Corporate Tax | NULL | 1700 |
| 14 | 140 | Control And Credit | NULL | 1700 |
| 15 | 150 | Shareholder Services | NULL | 1700 |
| 16 | 160 | Benefits | NULL | 1700 |
| 17 | 170 | Manufacturing | NULL | 1700 |
| 18 | 180 | Construction | NULL | 1700 |
| 19 | 190 | Contracting | NULL | 1700 |
| 20 | 200 | Operations | NULL | 1700 |
| 21 | 210 | IT Support | NULL | 1700 |
| 22 | 220 | NOC | NULL | 1700 |
| 23 | 230 | IT Helpdesk | NULL | 1700 |
| 24 | 240 | Government Sales | NULL | 1700 |
| 25 | 250 | Retail Sales | NULL | 1700 |
| 26 | 260 | Recruiting | NULL | 1700 |
| 27 | 270 | Payroll | NULL | 1700 |
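
The original section does not show the entity classes used to query these tables. As a rough sketch only, assuming JPA annotations, an Employee entity mapped to the EMPLOYEES table could look like the following; the property names match the HQL examples earlier in the article.

```java
// Rough sketch, not from the article: a JPA-annotated entity for the EMPLOYEES table above.
// A Department entity would follow the same pattern; getters and setters are omitted for brevity.
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import java.math.BigDecimal;
import java.util.Date;

@Entity
@Table(name = "EMPLOYEES")
public class Employee {

    @Id
    @Column(name = "EMPLOYEE_ID")
    private Long employeeId;

    @Column(name = "FIRST_NAME")
    private String firstName;

    @Column(name = "LAST_NAME")
    private String lastName;

    @Column(name = "EMAIL")
    private String email;

    @Column(name = "PHONE_NUMBER")
    private String phoneNumber;

    @Temporal(TemporalType.DATE)
    @Column(name = "HIRE_DATE")
    private Date hireDate;

    @Column(name = "JOB_ID")
    private String jobId;

    @Column(name = "SALARY")
    private BigDecimal salary;

    @Column(name = "COMMISSION_PCT")
    private BigDecimal commissionPct;

    @Column(name = "MANAGER_ID")
    private Long managerId;

    @Column(name = "DEPARTMENT_ID")
    private Long departmentId;
}
```

With a mapping along these lines, the HQL snippets shown earlier run directly against the HR data loaded by the scripts below.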



SET VERIFY OFF
ALTER SESSION SET NLS_LANGUAGE=American; 

REM ***************************insert data into the REGIONS table

Prompt ******  Populating REGIONS table ....

INSERT INTO regions VALUES 
        ( 1
        , 'Europe' 
        );

INSERT INTO regions VALUES 
        ( 2
        , 'Americas' 
        );

INSERT INTO regions VALUES 
        ( 3
        , 'Asia' 
        );

INSERT INTO regions VALUES 
        ( 4
        , 'Middle East and Africa' 
        );

REM ***************************insert data into the COUNTRIES table

Prompt ******  Populating COUNTRIES table ....

INSERT INTO countries VALUES 
        ( 'IT'
        , 'Italy'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'JP'
        , 'Japan'
    , 3 
        );

INSERT INTO countries VALUES 
        ( 'US'
        , 'United States of America'
        , 2 
        );

INSERT INTO countries VALUES 
        ( 'CA'
        , 'Canada'
        , 2 
        );

INSERT INTO countries VALUES 
        ( 'CN'
        , 'China'
        , 3 
        );

INSERT INTO countries VALUES 
        ( 'IN'
        , 'India'
        , 3 
        );

INSERT INTO countries VALUES 
        ( 'AU'
        , 'Australia'
        , 3 
        );

INSERT INTO countries VALUES 
        ( 'ZW'
        , 'Zimbabwe'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'SG'
        , 'Singapore'
        , 3 
        );

INSERT INTO countries VALUES 
        ( 'UK'
        , 'United Kingdom'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'FR'
        , 'France'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'DE'
        , 'Germany'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'ZM'
        , 'Zambia'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'EG'
        , 'Egypt'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'BR'
        , 'Brazil'
        , 2 
        );

INSERT INTO countries VALUES 
        ( 'CH'
        , 'Switzerland'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'NL'
        , 'Netherlands'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'MX'
        , 'Mexico'
        , 2 
        );

INSERT INTO countries VALUES 
        ( 'KW'
        , 'Kuwait'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'IL'
        , 'Israel'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'DK'
        , 'Denmark'
        , 1 
        );

INSERT INTO countries VALUES 
        ( 'HK'
        , 'HongKong'
        , 3 
        );

INSERT INTO countries VALUES 
        ( 'NG'
        , 'Nigeria'
        , 4 
        );

INSERT INTO countries VALUES 
        ( 'AR'
        , 'Argentina'
        , 2 
        );

INSERT INTO countries VALUES 
        ( 'BE'
        , 'Belgium'
        , 1 
        );


REM ***************************insert data into the LOCATIONS table

Prompt ******  Populating LOCATIONS table ....

INSERT INTO locations VALUES 
        ( 1000 
        , '1297 Via Cola di Rie'
        , '00989'
        , 'Roma'
        , NULL
        , 'IT'
        );

INSERT INTO locations VALUES 
        ( 1100 
        , '93091 Calle della Testa'
        , '10934'
        , 'Venice'
        , NULL
        , 'IT'
        );

INSERT INTO locations VALUES 
        ( 1200 
        , '2017 Shinjuku-ku'
        , '1689'
        , 'Tokyo'
        , 'Tokyo Prefecture'
        , 'JP'
        );

INSERT INTO locations VALUES 
        ( 1300 
        , '9450 Kamiya-cho'
        , '6823'
        , 'Hiroshima'
        , NULL
        , 'JP'
        );

INSERT INTO locations VALUES 
        ( 1400 
        , '2014 Jabberwocky Rd'
        , '26192'
        , 'Southlake'
        , 'Texas'
        , 'US'
        );

INSERT INTO locations VALUES 
        ( 1500 
        , '2011 Interiors Blvd'
        , '99236'
        , 'South San Francisco'
        , 'California'
        , 'US'
        );

INSERT INTO locations VALUES 
        ( 1600 
        , '2007 Zagora St'
        , '50090'
        , 'South Brunswick'
        , 'New Jersey'
        , 'US'
        );

INSERT INTO locations VALUES 
        ( 1700 
        , '2004 Charade Rd'
        , '98199'
        , 'Seattle'
        , 'Washington'
        , 'US'
        );

INSERT INTO locations VALUES 
        ( 1800 
        , '147 Spadina Ave'
        , 'M5V 2L7'
        , 'Toronto'
        , 'Ontario'
        , 'CA'
        );

INSERT INTO locations VALUES 
        ( 1900 
        , '6092 Boxwood St'
        , 'YSW 9T2'
        , 'Whitehorse'
        , 'Yukon'
        , 'CA'
        );

INSERT INTO locations VALUES 
        ( 2000 
        , '40-5-12 Laogianggen'
        , '190518'
        , 'Beijing'
        , NULL
        , 'CN'
        );

INSERT INTO locations VALUES 
        ( 2100 
        , '1298 Vileparle (E)'
        , '490231'
        , 'Bombay'
        , 'Maharashtra'
        , 'IN'
        );

INSERT INTO locations VALUES 
        ( 2200 
        , '12-98 Victoria Street'
        , '2901'
        , 'Sydney'
        , 'New South Wales'
        , 'AU'
        );

INSERT INTO locations VALUES 
        ( 2300 
        , '198 Clementi North'
        , '540198'
        , 'Singapore'
        , NULL
        , 'SG'
        );

INSERT INTO locations VALUES 
        ( 2400 
        , '8204 Arthur St'
        , NULL
        , 'London'
        , NULL
        , 'UK'
        );

INSERT INTO locations VALUES 
        ( 2500 
        , 'Magdalen Centre, The Oxford Science Park'
        , 'OX9 9ZB'
        , 'Oxford'
        , 'Oxford'
        , 'UK'
        );

INSERT INTO locations VALUES 
        ( 2600 
        , '9702 Chester Road'
        , '09629850293'
        , 'Stretford'
        , 'Manchester'
        , 'UK'
        );

INSERT INTO locations VALUES 
        ( 2700 
        , 'Schwanthalerstr. 7031'
        , '80925'
        , 'Munich'
        , 'Bavaria'
        , 'DE'
        );

INSERT INTO locations VALUES 
        ( 2800 
        , 'Rua Frei Caneca 1360 '
        , '01307-002'
        , 'Sao Paulo'
        , 'Sao Paulo'
        , 'BR'
        );

INSERT INTO locations VALUES 
        ( 2900 
        , '20 Rue des Corps-Saints'
        , '1730'
        , 'Geneva'
        , 'Geneve'
        , 'CH'
        );

INSERT INTO locations VALUES 
        ( 3000 
        , 'Murtenstrasse 921'
        , '3095'
        , 'Bern'
        , 'BE'
        , 'CH'
        );

INSERT INTO locations VALUES 
        ( 3100 
        , 'Pieter Breughelstraat 837'
        , '3029SK'
        , 'Utrecht'
        , 'Utrecht'
        , 'NL'
        );

INSERT INTO locations VALUES 
        ( 3200 
        , 'Mariano Escobedo 9991'
        , '11932'
        , 'Mexico City'
        , 'Distrito Federal,'
        , 'MX'
        );


REM ****************************insert data into the DEPARTMENTS table

Prompt ******  Populating DEPARTMENTS table ....

REM disable integrity constraint to EMPLOYEES to load data

ALTER TABLE departments 
  DISABLE CONSTRAINT dept_mgr_fk;

INSERT INTO departments VALUES 
        ( 10
        , 'Administration'
        , 200
        , 1700
        );

INSERT INTO departments VALUES 
        ( 20
        , 'Marketing'
        , 201
        , 1800
        );

INSERT INTO departments VALUES 
        ( 30
        , 'Purchasing'
        , 114
        , 1700
    );

INSERT INTO departments VALUES 
        ( 40
        , 'Human Resources'
        , 203
        , 2400
        );

INSERT INTO departments VALUES 
        ( 50
        , 'Shipping'
        , 121
        , 1500
        );

INSERT INTO departments VALUES 
        ( 60 
        , 'IT'
        , 103
        , 1400
        );

INSERT INTO departments VALUES 
        ( 70 
        , 'Public Relations'
        , 204
        , 2700
        );

INSERT INTO departments VALUES 
        ( 80 
        , 'Sales'
        , 145
        , 2500
        );

INSERT INTO departments VALUES 
        ( 90 
        , 'Executive'
        , 100
        , 1700
        );

INSERT INTO departments VALUES 
        ( 100 
        , 'Finance'
        , 108
        , 1700
        );

INSERT INTO departments VALUES 
        ( 110 
        , 'Accounting'
        , 205
        , 1700
        );

INSERT INTO departments VALUES 
        ( 120 
        , 'Treasury'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 130 
        , 'Corporate Tax'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 140 
        , 'Control And Credit'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 150 
        , 'Shareholder Services'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 160 
        , 'Benefits'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 170 
        , 'Manufacturing'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 180 
        , 'Construction'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 190 
        , 'Contracting'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 200 
        , 'Operations'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 210 
        , 'IT Support'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 220 
        , 'NOC'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 230 
        , 'IT Helpdesk'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 240 
        , 'Government Sales'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 250 
        , 'Retail Sales'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 260 
        , 'Recruiting'
        , NULL
        , 1700
        );

INSERT INTO departments VALUES 
        ( 270 
        , 'Payroll'
        , NULL
        , 1700
        );


REM ***************************insert data into the JOBS table

Prompt ******  Populating JOBS table ....

INSERT INTO jobs VALUES 
        ( 'AD_PRES'
        , 'President'
        , 20000
        , 40000
        );
INSERT INTO jobs VALUES 
        ( 'AD_VP'
        , 'Administration Vice President'
        , 15000
        , 30000
        );

INSERT INTO jobs VALUES 
        ( 'AD_ASST'
        , 'Administration Assistant'
        , 3000
        , 6000
        );

INSERT INTO jobs VALUES 
        ( 'FI_MGR'
        , 'Finance Manager'
        , 8200
        , 16000
        );

INSERT INTO jobs VALUES 
        ( 'FI_ACCOUNT'
        , 'Accountant'
        , 4200
        , 9000
        );

INSERT INTO jobs VALUES 
        ( 'AC_MGR'
        , 'Accounting Manager'
        , 8200
        , 16000
        );

INSERT INTO jobs VALUES 
        ( 'AC_ACCOUNT'
        , 'Public Accountant'
        , 4200
        , 9000
        );
INSERT INTO jobs VALUES 
        ( 'SA_MAN'
        , 'Sales Manager'
        , 10000
        , 20000
        );

INSERT INTO jobs VALUES 
        ( 'SA_REP'
        , 'Sales Representative'
        , 6000
        , 12000
        );

INSERT INTO jobs VALUES 
        ( 'PU_MAN'
        , 'Purchasing Manager'
        , 8000
        , 15000
        );

INSERT INTO jobs VALUES 
        ( 'PU_CLERK'
        , 'Purchasing Clerk'
        , 2500
        , 5500
        );

INSERT INTO jobs VALUES 
        ( 'ST_MAN'
        , 'Stock Manager'
        , 5500
        , 8500
        );
INSERT INTO jobs VALUES 
        ( 'ST_CLERK'
        , 'Stock Clerk'
        , 2000
        , 5000
        );

INSERT INTO jobs VALUES 
        ( 'SH_CLERK'
        , 'Shipping Clerk'
        , 2500
        , 5500
        );

INSERT INTO jobs VALUES 
        ( 'IT_PROG'
        , 'Programmer'
        , 4000
        , 10000
        );

INSERT INTO jobs VALUES 
        ( 'MK_MAN'
        , 'Marketing Manager'
        , 9000
        , 15000
        );

INSERT INTO jobs VALUES 
        ( 'MK_REP'
        , 'Marketing Representative'
        , 4000
        , 9000
        );

INSERT INTO jobs VALUES 
        ( 'HR_REP'
        , 'Human Resources Representative'
        , 4000
        , 9000
        );

INSERT INTO jobs VALUES 
        ( 'PR_REP'
        , 'Public Relations Representative'
        , 4500
        , 10500
        );


REM ***************************insert data into the EMPLOYEES table

Prompt ******  Populating EMPLOYEES table ....

INSERT INTO employees VALUES 
        ( 100
        , 'Steven'
        , 'King'
        , 'SKING'
        , '515.123.4567'
        , TO_DATE('17-JUN-1987', 'dd-MON-yyyy')
        , 'AD_PRES'
        , 24000
        , NULL
        , NULL
        , 90
        );

INSERT INTO employees VALUES 
        ( 101
        , 'Neena'
        , 'Kochhar'
        , 'NKOCHHAR'
        , '515.123.4568'
        , TO_DATE('21-SEP-1989', 'dd-MON-yyyy')
        , 'AD_VP'
        , 17000
        , NULL
        , 100
        , 90
        );

INSERT INTO employees VALUES 
        ( 102
        , 'Lex'
        , 'De Haan'
        , 'LDEHAAN'
        , '515.123.4569'
        , TO_DATE('13-JAN-1993', 'dd-MON-yyyy')
        , 'AD_VP'
        , 17000
        , NULL
        , 100
        , 90
        );

INSERT INTO employees VALUES 
        ( 103
        , 'Alexander'
        , 'Hunold'
        , 'AHUNOLD'
        , '590.423.4567'
        , TO_DATE('03-JAN-1990', 'dd-MON-yyyy')
        , 'IT_PROG'
        , 9000
        , NULL
        , 102
        , 60
        );

INSERT INTO employees VALUES 
        ( 104
        , 'Bruce'
        , 'Ernst'
        , 'BERNST'
        , '590.423.4568'
        , TO_DATE('21-MAY-1991', 'dd-MON-yyyy')
        , 'IT_PROG'
        , 6000
        , NULL
        , 103
        , 60
        );

INSERT INTO employees VALUES 
        ( 105
        , 'David'
        , 'Austin'
        , 'DAUSTIN'
        , '590.423.4569'
        , TO_DATE('25-JUN-1997', 'dd-MON-yyyy')
        , 'IT_PROG'
        , 4800
        , NULL
        , 103
        , 60
        );

INSERT INTO employees VALUES 
        ( 106
        , 'Valli'
        , 'Pataballa'
        , 'VPATABAL'
        , '590.423.4560'
        , TO_DATE('05-FEB-1998', 'dd-MON-yyyy')
        , 'IT_PROG'
        , 4800
        , NULL
        , 103
        , 60
        );

INSERT INTO employees VALUES 
        ( 107
        , 'Diana'
        , 'Lorentz'
        , 'DLORENTZ'
        , '590.423.5567'
        , TO_DATE('07-FEB-1999', 'dd-MON-yyyy')
        , 'IT_PROG'
        , 4200
        , NULL
        , 103
        , 60
        );

INSERT INTO employees VALUES 
        ( 108
        , 'Nancy'
        , 'Greenberg'
        , 'NGREENBE'
        , '515.124.4569'
        , TO_DATE('17-AUG-1994', 'dd-MON-yyyy')
        , 'FI_MGR'
        , 12000
        , NULL
        , 101
        , 100
        );

INSERT INTO employees VALUES 
        ( 109
        , 'Daniel'
        , 'Faviet'
        , 'DFAVIET'
        , '515.124.4169'
        , TO_DATE('16-AUG-1994', 'dd-MON-yyyy')
        , 'FI_ACCOUNT'
        , 9000
        , NULL
        , 108
        , 100
        );

INSERT INTO employees VALUES 
        ( 110
        , 'John'
        , 'Chen'
        , 'JCHEN'
        , '515.124.4269'
        , TO_DATE('28-SEP-1997', 'dd-MON-yyyy')
        , 'FI_ACCOUNT'
        , 8200
        , NULL
        , 108
        , 100
        );

INSERT INTO employees VALUES 
        ( 111
        , 'Ismael'
        , 'Sciarra'
        , 'ISCIARRA'
        , '515.124.4369'
        , TO_DATE('30-SEP-1997', 'dd-MON-yyyy')
        , 'FI_ACCOUNT'
        , 7700
        , NULL
        , 108
        , 100
        );

INSERT INTO employees VALUES 
        ( 112
        , 'Jose Manuel'
        , 'Urman'
        , 'JMURMAN'
        , '515.124.4469'
        , TO_DATE('07-MAR-1998', 'dd-MON-yyyy')
        , 'FI_ACCOUNT'
        , 7800
        , NULL
        , 108
        , 100
        );

INSERT INTO employees VALUES 
        ( 113
        , 'Luis'
        , 'Popp'
        , 'LPOPP'
        , '515.124.4567'
        , TO_DATE('07-DEC-1999', 'dd-MON-yyyy')
        , 'FI_ACCOUNT'
        , 6900
        , NULL
        , 108
        , 100
        );

INSERT INTO employees VALUES 
        ( 114
        , 'Den'
        , 'Raphaely'
        , 'DRAPHEAL'
        , '515.127.4561'
        , TO_DATE('07-DEC-1994', 'dd-MON-yyyy')
        , 'PU_MAN'
        , 11000
        , NULL
        , 100
        , 30
        );

INSERT INTO employees VALUES 
        ( 115
        , 'Alexander'
        , 'Khoo'
        , 'AKHOO'
        , '515.127.4562'
        , TO_DATE('18-MAY-1995', 'dd-MON-yyyy')
        , 'PU_CLERK'
        , 3100
        , NULL
        , 114
        , 30
        );

INSERT INTO employees VALUES 
        ( 116
        , 'Shelli'
        , 'Baida'
        , 'SBAIDA'
        , '515.127.4563'
        , TO_DATE('24-DEC-1997', 'dd-MON-yyyy')
        , 'PU_CLERK'
        , 2900
        , NULL
        , 114
        , 30
        );

INSERT INTO employees VALUES 
        ( 117
        , 'Sigal'
        , 'Tobias'
        , 'STOBIAS'
        , '515.127.4564'
        , TO_DATE('24-JUL-1997', 'dd-MON-yyyy')
        , 'PU_CLERK'
        , 2800
        , NULL
        , 114
        , 30
        );

INSERT INTO employees VALUES 
        ( 118
        , 'Guy'
        , 'Himuro'
        , 'GHIMURO'
        , '515.127.4565'
        , TO_DATE('15-NOV-1998', 'dd-MON-yyyy')
        , 'PU_CLERK'
        , 2600
        , NULL
        , 114
        , 30
        );

INSERT INTO employees VALUES 
        ( 119
        , 'Karen'
        , 'Colmenares'
        , 'KCOLMENA'
        , '515.127.4566'
        , TO_DATE('10-AUG-1999', 'dd-MON-yyyy')
        , 'PU_CLERK'
        , 2500
        , NULL
        , 114
        , 30
        );

INSERT INTO employees VALUES 
        ( 120
        , 'Matthew'
        , 'Weiss'
        , 'MWEISS'
        , '650.123.1234'
        , TO_DATE('18-JUL-1996', 'dd-MON-yyyy')
        , 'ST_MAN'
        , 8000
        , NULL
        , 100
        , 50
        );

INSERT INTO employees VALUES 
        ( 121
        , 'Adam'
        , 'Fripp'
        , 'AFRIPP'
        , '650.123.2234'
        , TO_DATE('10-APR-1997', 'dd-MON-yyyy')
        , 'ST_MAN'
        , 8200
        , NULL
        , 100
        , 50
        );

INSERT INTO employees VALUES 
        ( 122
        , 'Payam'
        , 'Kaufling'
        , 'PKAUFLIN'
        , '650.123.3234'
        , TO_DATE('01-MAY-1995', 'dd-MON-yyyy')
        , 'ST_MAN'
        , 7900
        , NULL
        , 100
        , 50
        );

INSERT INTO employees VALUES 
        ( 123
        , 'Shanta'
        , 'Vollman'
        , 'SVOLLMAN'
        , '650.123.4234'
        , TO_DATE('10-OCT-1997', 'dd-MON-yyyy')
        , 'ST_MAN'
        , 6500
        , NULL
        , 100
        , 50
        );

INSERT INTO employees VALUES 
        ( 124
        , 'Kevin'
        , 'Mourgos'
        , 'KMOURGOS'
        , '650.123.5234'
        , TO_DATE('16-NOV-1999', 'dd-MON-yyyy')
        , 'ST_MAN'
        , 5800
        , NULL
        , 100
        , 50
        );

INSERT INTO employees VALUES 
        ( 125
        , 'Julia'
        , 'Nayer'
        , 'JNAYER'
        , '650.124.1214'
        , TO_DATE('16-JUL-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3200
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 126
        , 'Irene'
        , 'Mikkilineni'
        , 'IMIKKILI'
        , '650.124.1224'
        , TO_DATE('28-SEP-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2700
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 127
        , 'James'
        , 'Landry'
        , 'JLANDRY'
        , '650.124.1334'
        , TO_DATE('14-JAN-1999', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2400
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 128
        , 'Steven'
        , 'Markle'
        , 'SMARKLE'
        , '650.124.1434'
        , TO_DATE('08-MAR-2000', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2200
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 129
        , 'Laura'
        , 'Bissot'
        , 'LBISSOT'
        , '650.124.5234'
        , TO_DATE('20-AUG-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3300
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 130
        , 'Mozhe'
        , 'Atkinson'
        , 'MATKINSO'
        , '650.124.6234'
        , TO_DATE('30-OCT-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2800
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 131
        , 'James'
        , 'Marlow'
        , 'JAMRLOW'
        , '650.124.7234'
        , TO_DATE('16-FEB-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2500
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 132
        , 'TJ'
        , 'Olson'
        , 'TJOLSON'
        , '650.124.8234'
        , TO_DATE('10-APR-1999', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2100
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 133
        , 'Jason'
        , 'Mallin'
        , 'JMALLIN'
        , '650.127.1934'
        , TO_DATE('14-JUN-1996', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3300
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 134
        , 'Michael'
        , 'Rogers'
        , 'MROGERS'
        , '650.127.1834'
        , TO_DATE('26-AUG-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2900
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 135
        , 'Ki'
        , 'Gee'
        , 'KGEE'
        , '650.127.1734'
        , TO_DATE('12-DEC-1999', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2400
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 136
        , 'Hazel'
        , 'Philtanker'
        , 'HPHILTAN'
        , '650.127.1634'
        , TO_DATE('06-FEB-2000', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2200
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 137
        , 'Renske'
        , 'Ladwig'
        , 'RLADWIG'
        , '650.121.1234'
        , TO_DATE('14-JUL-1995', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3600
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 138
        , 'Stephen'
        , 'Stiles'
        , 'SSTILES'
        , '650.121.2034'
        , TO_DATE('26-OCT-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3200
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 139
        , 'John'
        , 'Seo'
        , 'JSEO'
        , '650.121.2019'
        , TO_DATE('12-FEB-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2700
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 140
        , 'Joshua'
        , 'Patel'
        , 'JPATEL'
        , '650.121.1834'
        , TO_DATE('06-APR-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2500
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 141
        , 'Trenna'
        , 'Rajs'
        , 'TRAJS'
        , '650.121.8009'
        , TO_DATE('17-OCT-1995', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3500
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 142
        , 'Curtis'
        , 'Davies'
        , 'CDAVIES'
        , '650.121.2994'
        , TO_DATE('29-JAN-1997', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 3100
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 143
        , 'Randall'
        , 'Matos'
        , 'RMATOS'
        , '650.121.2874'
        , TO_DATE('15-MAR-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2600
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 144
        , 'Peter'
        , 'Vargas'
        , 'PVARGAS'
        , '650.121.2004'
        , TO_DATE('09-JUL-1998', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 2500
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 145
        , 'John'
        , 'Russell'
        , 'JRUSSEL'
        , '011.44.1344.429268'
        , TO_DATE('01-OCT-1996', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 14000
        , .4
        , 100
        , 80
        );

INSERT INTO employees VALUES 
        ( 146
        , 'Karen'
        , 'Partners'
        , 'KPARTNER'
        , '011.44.1344.467268'
        , TO_DATE('05-JAN-1997', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 13500
        , .3
        , 100
        , 80
        );

INSERT INTO employees VALUES 
        ( 147
        , 'Alberto'
        , 'Errazuriz'
        , 'AERRAZUR'
        , '011.44.1344.429278'
        , TO_DATE('10-MAR-1997', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 12000
        , .3
        , 100
        , 80
        );

INSERT INTO employees VALUES 
        ( 148
        , 'Gerald'
        , 'Cambrault'
        , 'GCAMBRAU'
        , '011.44.1344.619268'
        , TO_DATE('15-OCT-1999', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 11000
        , .3
        , 100
        , 80
        );

INSERT INTO employees VALUES 
        ( 149
        , 'Eleni'
        , 'Zlotkey'
        , 'EZLOTKEY'
        , '011.44.1344.429018'
        , TO_DATE('29-JAN-2000', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 10500
        , .2
        , 100
        , 80
        );

INSERT INTO employees VALUES 
        ( 150
        , 'Peter'
        , 'Tucker'
        , 'PTUCKER'
        , '011.44.1344.129268'
        , TO_DATE('30-JAN-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 10000
        , .3
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 151
        , 'David'
        , 'Bernstein'
        , 'DBERNSTE'
        , '011.44.1344.345268'
        , TO_DATE('24-MAR-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9500
        , .25
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 152
        , 'Peter'
        , 'Hall'
        , 'PHALL'
        , '011.44.1344.478968'
        , TO_DATE('20-AUG-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9000
        , .25
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 153
        , 'Christopher'
        , 'Olsen'
        , 'COLSEN'
        , '011.44.1344.498718'
        , TO_DATE('30-MAR-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 8000
        , .2
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 154
        , 'Nanette'
        , 'Cambrault'
        , 'NCAMBRAU'
        , '011.44.1344.987668'
        , TO_DATE('09-DEC-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7500
        , .2
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 155
        , 'Oliver'
        , 'Tuvault'
        , 'OTUVAULT'
        , '011.44.1344.486508'
        , TO_DATE('23-NOV-1999', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7000
        , .15
        , 145
        , 80
        );

INSERT INTO employees VALUES 
        ( 156
        , 'Janette'
        , 'King'
        , 'JKING'
        , '011.44.1345.429268'
        , TO_DATE('30-JAN-1996', 'dd-MON-yyyy')
        , 'SA_REP'
        , 10000
        , .35
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 157
        , 'Patrick'
        , 'Sully'
        , 'PSULLY'
        , '011.44.1345.929268'
        , TO_DATE('04-MAR-1996', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9500
        , .35
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 158
        , 'Allan'
        , 'McEwen'
        , 'AMCEWEN'
        , '011.44.1345.829268'
        , TO_DATE('01-AUG-1996', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9000
        , .35
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 159
        , 'Lindsey'
        , 'Smith'
        , 'LSMITH'
        , '011.44.1345.729268'
        , TO_DATE('10-MAR-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 8000
        , .3
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 160
        , 'Louise'
        , 'Doran'
        , 'LDORAN'
        , '011.44.1345.629268'
        , TO_DATE('15-DEC-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7500
        , .3
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 161
        , 'Sarath'
        , 'Sewall'
        , 'SSEWALL'
        , '011.44.1345.529268'
        , TO_DATE('03-NOV-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7000
        , .25
        , 146
        , 80
        );

INSERT INTO employees VALUES 
        ( 162
        , 'Clara'
        , 'Vishney'
        , 'CVISHNEY'
        , '011.44.1346.129268'
        , TO_DATE('11-NOV-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 10500
        , .25
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 163
        , 'Danielle'
        , 'Greene'
        , 'DGREENE'
        , '011.44.1346.229268'
        , TO_DATE('19-MAR-1999', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9500
        , .15
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 164
        , 'Mattea'
        , 'Marvins'
        , 'MMARVINS'
        , '011.44.1346.329268'
        , TO_DATE('24-JAN-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7200
        , .10
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 165
        , 'David'
        , 'Lee'
        , 'DLEE'
        , '011.44.1346.529268'
        , TO_DATE('23-FEB-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 6800
        , .1
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 166
        , 'Sundar'
        , 'Ande'
        , 'SANDE'
        , '011.44.1346.629268'
        , TO_DATE('24-MAR-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 6400
        , .10
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 167
        , 'Amit'
        , 'Banda'
        , 'ABANDA'
        , '011.44.1346.729268'
        , TO_DATE('21-APR-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 6200
        , .10
        , 147
        , 80
        );

INSERT INTO employees VALUES 
        ( 168
        , 'Lisa'
        , 'Ozer'
        , 'LOZER'
        , '011.44.1343.929268'
        , TO_DATE('11-MAR-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 11500
        , .25
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 169  
        , 'Harrison'
        , 'Bloom'
        , 'HBLOOM'
        , '011.44.1343.829268'
        , TO_DATE('23-MAR-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 10000
        , .20
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 170
        , 'Tayler'
        , 'Fox'
        , 'TFOX'
        , '011.44.1343.729268'
        , TO_DATE('24-JAN-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 9600
        , .20
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 171
        , 'William'
        , 'Smith'
        , 'WSMITH'
        , '011.44.1343.629268'
        , TO_DATE('23-FEB-1999', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7400
        , .15
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 172
        , 'Elizabeth'
        , 'Bates'
        , 'EBATES'
        , '011.44.1343.529268'
        , TO_DATE('24-MAR-1999', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7300
        , .15
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 173
        , 'Sundita'
        , 'Kumar'
        , 'SKUMAR'
        , '011.44.1343.329268'
        , TO_DATE('21-APR-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 6100
        , .10
        , 148
        , 80
        );

INSERT INTO employees VALUES 
        ( 174
        , 'Ellen'
        , 'Abel'
        , 'EABEL'
        , '011.44.1644.429267'
        , TO_DATE('11-MAY-1996', 'dd-MON-yyyy')
        , 'SA_REP'
        , 11000
        , .30
        , 149
        , 80
        );

INSERT INTO employees VALUES 
        ( 175
        , 'Alyssa'
        , 'Hutton'
        , 'AHUTTON'
        , '011.44.1644.429266'
        , TO_DATE('19-MAR-1997', 'dd-MON-yyyy')
        , 'SA_REP'
        , 8800
        , .25
        , 149
        , 80
        );

INSERT INTO employees VALUES 
        ( 176
        , 'Jonathon'
        , 'Taylor'
        , 'JTAYLOR'
        , '011.44.1644.429265'
        , TO_DATE('24-MAR-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 8600
        , .20
        , 149
        , 80
        );

INSERT INTO employees VALUES 
        ( 177
        , 'Jack'
        , 'Livingston'
        , 'JLIVINGS'
        , '011.44.1644.429264'
        , TO_DATE('23-APR-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 8400
        , .20
        , 149
        , 80
        );

INSERT INTO employees VALUES 
        ( 178
        , 'Kimberely'
        , 'Grant'
        , 'KGRANT'
        , '011.44.1644.429263'
        , TO_DATE('24-MAY-1999', 'dd-MON-yyyy')
        , 'SA_REP'
        , 7000
        , .15
        , 149
        , NULL
        );

INSERT INTO employees VALUES 
        ( 179
        , 'Charles'
        , 'Johnson'
        , 'CJOHNSON'
        , '011.44.1644.429262'
        , TO_DATE('04-JAN-2000', 'dd-MON-yyyy')
        , 'SA_REP'
        , 6200
        , .10
        , 149
        , 80
        );

INSERT INTO employees VALUES 
        ( 180
        , 'Winston'
        , 'Taylor'
        , 'WTAYLOR'
        , '650.507.9876'
        , TO_DATE('24-JAN-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3200
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 181
        , 'Jean'
        , 'Fleaur'
        , 'JFLEAUR'
        , '650.507.9877'
        , TO_DATE('23-FEB-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3100
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 182
        , 'Martha'
        , 'Sullivan'
        , 'MSULLIVA'
        , '650.507.9878'
        , TO_DATE('21-JUN-1999', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2500
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 183
        , 'Girard'
        , 'Geoni'
        , 'GGEONI'
        , '650.507.9879'
        , TO_DATE('03-FEB-2000', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2800
        , NULL
        , 120
        , 50
        );

INSERT INTO employees VALUES 
        ( 184
        , 'Nandita'
        , 'Sarchand'
        , 'NSARCHAN'
        , '650.509.1876'
        , TO_DATE('27-JAN-1996', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 4200
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 185
        , 'Alexis'
        , 'Bull'
        , 'ABULL'
        , '650.509.2876'
        , TO_DATE('20-FEB-1997', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 4100
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 186
        , 'Julia'
        , 'Dellinger'
        , 'JDELLING'
        , '650.509.3876'
        , TO_DATE('24-JUN-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3400
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 187
        , 'Anthony'
        , 'Cabrio'
        , 'ACABRIO'
        , '650.509.4876'
        , TO_DATE('07-FEB-1999', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3000
        , NULL
        , 121
        , 50
        );

INSERT INTO employees VALUES 
        ( 188
        , 'Kelly'
        , 'Chung'
        , 'KCHUNG'
        , '650.505.1876'
        , TO_DATE('14-JUN-1997', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3800
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 189
        , 'Jennifer'
        , 'Dilly'
        , 'JDILLY'
        , '650.505.2876'
        , TO_DATE('13-AUG-1997', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3600
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 190
        , 'Timothy'
        , 'Gates'
        , 'TGATES'
        , '650.505.3876'
        , TO_DATE('11-JUL-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2900
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 191
        , 'Randall'
        , 'Perkins'
        , 'RPERKINS'
        , '650.505.4876'
        , TO_DATE('19-DEC-1999', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2500
        , NULL
        , 122
        , 50
        );

INSERT INTO employees VALUES 
        ( 192
        , 'Sarah'
        , 'Bell'
        , 'SBELL'
        , '650.501.1876'
        , TO_DATE('04-FEB-1996', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 4000
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 193
        , 'Britney'
        , 'Everett'
        , 'BEVERETT'
        , '650.501.2876'
        , TO_DATE('03-MAR-1997', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3900
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 194
        , 'Samuel'
        , 'McCain'
        , 'SMCCAIN'
        , '650.501.3876'
        , TO_DATE('01-JUL-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3200
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 195
        , 'Vance'
        , 'Jones'
        , 'VJONES'
        , '650.501.4876'
        , TO_DATE('17-MAR-1999', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2800
        , NULL
        , 123
        , 50
        );

INSERT INTO employees VALUES 
        ( 196
        , 'Alana'
        , 'Walsh'
        , 'AWALSH'
        , '650.507.9811'
        , TO_DATE('24-APR-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3100
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 197
        , 'Kevin'
        , 'Feeney'
        , 'KFEENEY'
        , '650.507.9822'
        , TO_DATE('23-MAY-1998', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 3000
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 198
        , 'Donald'
        , 'OConnell'
        , 'DOCONNEL'
        , '650.507.9833'
        , TO_DATE('21-JUN-1999', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2600
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 199
        , 'Douglas'
        , 'Grant'
        , 'DGRANT'
        , '650.507.9844'
        , TO_DATE('13-JAN-2000', 'dd-MON-yyyy')
        , 'SH_CLERK'
        , 2600
        , NULL
        , 124
        , 50
        );

INSERT INTO employees VALUES 
        ( 200
        , 'Jennifer'
        , 'Whalen'
        , 'JWHALEN'
        , '515.123.4444'
        , TO_DATE('17-SEP-1987', 'dd-MON-yyyy')
        , 'AD_ASST'
        , 4400
        , NULL
        , 101
        , 10
        );

INSERT INTO employees VALUES 
        ( 201
        , 'Michael'
        , 'Hartstein'
        , 'MHARTSTE'
        , '515.123.5555'
        , TO_DATE('17-FEB-1996', 'dd-MON-yyyy')
        , 'MK_MAN'
        , 13000
        , NULL
        , 100
        , 20
        );

INSERT INTO employees VALUES 
        ( 202
        , 'Pat'
        , 'Fay'
        , 'PFAY'
        , '603.123.6666'
        , TO_DATE('17-AUG-1997', 'dd-MON-yyyy')
        , 'MK_REP'
        , 6000
        , NULL
        , 201
        , 20
        );

INSERT INTO employees VALUES 
        ( 203
        , 'Susan'
        , 'Mavris'
        , 'SMAVRIS'
        , '515.123.7777'
        , TO_DATE('07-JUN-1994', 'dd-MON-yyyy')
        , 'HR_REP'
        , 6500
        , NULL
        , 101
        , 40
        );

INSERT INTO employees VALUES 
        ( 204
        , 'Hermann'
        , 'Baer'
        , 'HBAER'
        , '515.123.8888'
        , TO_DATE('07-JUN-1994', 'dd-MON-yyyy')
        , 'PR_REP'
        , 10000
        , NULL
        , 101
        , 70
        );

INSERT INTO employees VALUES 
        ( 205
        , 'Shelley'
        , 'Higgins'
        , 'SHIGGINS'
        , '515.123.8080'
        , TO_DATE('07-JUN-1994', 'dd-MON-yyyy')
        , 'AC_MGR'
        , 12000
        , NULL
        , 101
        , 110
        );

INSERT INTO employees VALUES 
        ( 206
        , 'William'
        , 'Gietz'
        , 'WGIETZ'
        , '515.123.8181'
        , TO_DATE('07-JUN-1994', 'dd-MON-yyyy')
        , 'AC_ACCOUNT'
        , 8300
        , NULL
        , 205
        , 110
        );

REM ********* insert data into the JOB_HISTORY table

Prompt ******  Populating JOB_HISTORY table ....


INSERT INTO job_history
VALUES (102
       , TO_DATE('13-JAN-1993', 'dd-MON-yyyy')
       , TO_DATE('24-JUL-1998', 'dd-MON-yyyy')
       , 'IT_PROG'
       , 60);

INSERT INTO job_history
VALUES (101
       , TO_DATE('21-SEP-1989', 'dd-MON-yyyy')
       , TO_DATE('27-OCT-1993', 'dd-MON-yyyy')
       , 'AC_ACCOUNT'
       , 110);

INSERT INTO job_history
VALUES (101
       , TO_DATE('28-OCT-1993', 'dd-MON-yyyy')
       , TO_DATE('15-MAR-1997', 'dd-MON-yyyy')
       , 'AC_MGR'
       , 110);

INSERT INTO job_history
VALUES (201
       , TO_DATE('17-FEB-1996', 'dd-MON-yyyy')
       , TO_DATE('19-DEC-1999', 'dd-MON-yyyy')
       , 'MK_REP'
       , 20);

INSERT INTO job_history
VALUES  (114
        , TO_DATE('24-MAR-1998', 'dd-MON-yyyy')
        , TO_DATE('31-DEC-1999', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 50
        );

INSERT INTO job_history
VALUES  (122
        , TO_DATE('01-JAN-1999', 'dd-MON-yyyy')
        , TO_DATE('31-DEC-1999', 'dd-MON-yyyy')
        , 'ST_CLERK'
        , 50
        );

INSERT INTO job_history
VALUES  (200
        , TO_DATE('17-SEP-1987', 'dd-MON-yyyy')
        , TO_DATE('17-JUN-1993', 'dd-MON-yyyy')
        , 'AD_ASST'
        , 90
        );

INSERT INTO job_history
VALUES  (176
        , TO_DATE('24-MAR-1998', 'dd-MON-yyyy')
        , TO_DATE('31-DEC-1998', 'dd-MON-yyyy')
        , 'SA_REP'
        , 80
        );

INSERT INTO job_history
VALUES  (176
        , TO_DATE('01-JAN-1999', 'dd-MON-yyyy')
        , TO_DATE('31-DEC-1999', 'dd-MON-yyyy')
        , 'SA_MAN'
        , 80
        );

INSERT INTO job_history
VALUES  (200
        , TO_DATE('01-JUL-1994', 'dd-MON-yyyy')
        , TO_DATE('31-DEC-1998', 'dd-MON-yyyy')
        , 'AC_ACCOUNT'
        , 90
        );

REM enable integrity constraint to DEPARTMENTS

ALTER TABLE departments 
  ENABLE CONSTRAINT dept_mgr_fk;

COMMIT;


The second script, which creates the tables:


SET FEEDBACK 1
SET NUMWIDTH 10
SET LINESIZE 80
SET TRIMSPOOL ON
SET TAB OFF
SET PAGESIZE 100
SET ECHO OFF 

REM ********************************************************************
REM Create the REGIONS table to hold region information for locations
REM HR.LOCATIONS table has a foreign key to this table.

Prompt ******  Creating REGIONS table ....

CREATE TABLE regions
    ( region_id      NUMBER 
       CONSTRAINT  region_id_nn NOT NULL 
    , region_name    VARCHAR2(25) 
    );

CREATE UNIQUE INDEX reg_id_pk
ON regions (region_id);

ALTER TABLE regions
ADD ( CONSTRAINT reg_id_pk
             PRIMARY KEY (region_id)
    ) ;

REM ********************************************************************
REM Create the COUNTRIES table to hold country information for customers
REM and company locations. 
REM OE.CUSTOMERS table and HR.LOCATIONS have a foreign key to this table.

Prompt ******  Creating COUNTRIES table ....

CREATE TABLE countries 
    ( country_id      CHAR(2) 
       CONSTRAINT  country_id_nn NOT NULL 
    , country_name    VARCHAR2(40) 
    , region_id       NUMBER 
    , CONSTRAINT     country_c_id_pk 
                 PRIMARY KEY (country_id) 
    ) 
    ORGANIZATION INDEX; 

ALTER TABLE countries
ADD ( CONSTRAINT countr_reg_fk
             FOREIGN KEY (region_id)
              REFERENCES regions(region_id) 
    ) ;

REM ********************************************************************
REM Create the LOCATIONS table to hold address information for company departments.
REM HR.DEPARTMENTS has a foreign key to this table.

Prompt ******  Creating LOCATIONS table ....

CREATE TABLE locations
    ( location_id    NUMBER(4)
    , street_address VARCHAR2(40)
    , postal_code    VARCHAR2(12)
    , city       VARCHAR2(30)
    CONSTRAINT     loc_city_nn  NOT NULL
    , state_province VARCHAR2(25)
    , country_id     CHAR(2)
    ) ;

CREATE UNIQUE INDEX loc_id_pk
ON locations (location_id) ;

ALTER TABLE locations
ADD ( CONSTRAINT loc_id_pk
             PRIMARY KEY (location_id)
    , CONSTRAINT loc_c_id_fk
             FOREIGN KEY (country_id)
              REFERENCES countries(country_id) 
    ) ;

Rem     Useful for any subsequent addition of rows to locations table
Rem     Starts with 3300

CREATE SEQUENCE locations_seq
 START WITH     3300
 INCREMENT BY   100
 MAXVALUE       9900
 NOCACHE
 NOCYCLE;

REM ********************************************************************
REM Create the DEPARTMENTS table to hold company department information.
REM HR.EMPLOYEES and HR.JOB_HISTORY have a foreign key to this table.

Prompt ******  Creating DEPARTMENTS table ....

CREATE TABLE departments
    ( department_id    NUMBER(4)
    , department_name  VARCHAR2(30)
    CONSTRAINT  dept_name_nn  NOT NULL
    , manager_id       NUMBER(6)
    , location_id      NUMBER(4)
    ) ;

CREATE UNIQUE INDEX dept_id_pk
ON departments (department_id) ;

ALTER TABLE departments
ADD ( CONSTRAINT dept_id_pk
             PRIMARY KEY (department_id)
    , CONSTRAINT dept_loc_fk
             FOREIGN KEY (location_id)
              REFERENCES locations (location_id)
     ) ;

Rem     Useful for any subsequent addition of rows to departments table
Rem     Starts with 280 

CREATE SEQUENCE departments_seq
 START WITH     280
 INCREMENT BY   10
 MAXVALUE       9990
 NOCACHE
 NOCYCLE;

REM ********************************************************************
REM Create the JOBS table to hold the different names of job roles within the company.
REM HR.EMPLOYEES has a foreign key to this table.

Prompt ******  Creating JOBS table ....

CREATE TABLE jobs
    ( job_id         VARCHAR2(10)
    , job_title      VARCHAR2(35)
    CONSTRAINT     job_title_nn  NOT NULL
    , min_salary     NUMBER(6)
    , max_salary     NUMBER(6)
    ) ;

CREATE UNIQUE INDEX job_id_pk 
ON jobs (job_id) ;

ALTER TABLE jobs
ADD ( CONSTRAINT job_id_pk
             PRIMARY KEY(job_id)
    ) ;

REM ********************************************************************
REM Create the EMPLOYEES table to hold the employee personnel 
REM information for the company.
REM HR.EMPLOYEES has a self referencing foreign key to this table.

Prompt ******  Creating EMPLOYEES table ....

CREATE TABLE employees
    ( employee_id    NUMBER(6)
    , first_name     VARCHAR2(20)
    , last_name      VARCHAR2(25)
     CONSTRAINT     emp_last_name_nn  NOT NULL
    , email          VARCHAR2(25)
    CONSTRAINT     emp_email_nn  NOT NULL
    , phone_number   VARCHAR2(20)
    , hire_date      DATE
    CONSTRAINT     emp_hire_date_nn  NOT NULL
    , job_id         VARCHAR2(10)
    CONSTRAINT     emp_job_nn  NOT NULL
    , salary         NUMBER(8,2)
    , commission_pct NUMBER(2,2)
    , manager_id     NUMBER(6)
    , department_id  NUMBER(4)
    , CONSTRAINT     emp_salary_min
                     CHECK (salary > 0) 
    , CONSTRAINT     emp_email_uk
                     UNIQUE (email)
    ) ;

CREATE UNIQUE INDEX emp_emp_id_pk
ON employees (employee_id) ;


ALTER TABLE employees
ADD ( CONSTRAINT     emp_emp_id_pk
                     PRIMARY KEY (employee_id)
    , CONSTRAINT     emp_dept_fk
                     FOREIGN KEY (department_id)
                      REFERENCES departments
    , CONSTRAINT     emp_job_fk
                     FOREIGN KEY (job_id)
                      REFERENCES jobs (job_id)
    , CONSTRAINT     emp_manager_fk
                     FOREIGN KEY (manager_id)
                      REFERENCES employees
    ) ;

ALTER TABLE departments
ADD ( CONSTRAINT dept_mgr_fk
             FOREIGN KEY (manager_id)
              REFERENCES employees (employee_id)
    ) ;


Rem     Useful for any subsequent addition of rows to employees table
Rem     Starts with 207 


CREATE SEQUENCE employees_seq
 START WITH     207
 INCREMENT BY   1
 NOCACHE
 NOCYCLE;

REM ********************************************************************
REM Create the JOB_HISTORY table to hold the history of jobs that 
REM employees have held in the past.
REM HR.JOBS, HR_DEPARTMENTS, and HR.EMPLOYEES have a foreign key to this table.

Prompt ******  Creating JOB_HISTORY table ....

CREATE TABLE job_history
    ( employee_id   NUMBER(6)
     CONSTRAINT    jhist_employee_nn  NOT NULL
    , start_date    DATE
    CONSTRAINT    jhist_start_date_nn  NOT NULL
    , end_date      DATE
    CONSTRAINT    jhist_end_date_nn  NOT NULL
    , job_id        VARCHAR2(10)
    CONSTRAINT    jhist_job_nn  NOT NULL
    , department_id NUMBER(4)
    , CONSTRAINT    jhist_date_interval
                    CHECK (end_date > start_date)
    ) ;

CREATE UNIQUE INDEX jhist_emp_id_st_date_pk 
ON job_history (employee_id, start_date) ;

ALTER TABLE job_history
ADD ( CONSTRAINT jhist_emp_id_st_date_pk
      PRIMARY KEY (employee_id, start_date)
    , CONSTRAINT     jhist_job_fk
                     FOREIGN KEY (job_id)
                     REFERENCES jobs
    , CONSTRAINT     jhist_emp_fk
                     FOREIGN KEY (employee_id)
                     REFERENCES employees
    , CONSTRAINT     jhist_dept_fk
                     FOREIGN KEY (department_id)
                     REFERENCES departments
    ) ;

REM ********************************************************************
REM Create the EMP_DETAILS_VIEW that joins the employees, jobs, 
REM departments, jobs, countries, and locations table to provide details
REM about employees.

Prompt ******  Creating EMP_DETAILS_VIEW view ...

CREATE OR REPLACE VIEW emp_details_view
  (employee_id,
   job_id,
   manager_id,
   department_id,
   location_id,
   country_id,
   first_name,
   last_name,
   salary,
   commission_pct,
   department_name,
   job_title,
   city,
   state_province,
   country_name,
   region_name)
AS SELECT
  e.employee_id, 
  e.job_id, 
  e.manager_id, 
  e.department_id,
  d.location_id,
  l.country_id,
  e.first_name,
  e.last_name,
  e.salary,
  e.commission_pct,
  d.department_name,
  j.job_title,
  l.city,
  l.state_province,
  c.country_name,
  r.region_name
FROM
  employees e,
  departments d,
  jobs j,
  locations l,
  countries c,
  regions r
WHERE e.department_id = d.department_id
  AND d.location_id = l.location_id
  AND l.country_id = c.country_id
  AND c.region_id = r.region_id
  AND j.job_id = e.job_id 
WITH READ ONLY;

COMMIT;

The third script: cleanup and supplementary data (including the JOB_GRADES table).

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

ALTER TABLE departments
DISABLE CONSTRAINT DEPT_MGR_FK;

ALTER TABLE job_history
DISABLE CONSTRAINT JHIST_EMP_FK;

DROP TRIGGER secure_employees;

DROP TRIGGER update_job_history;

DROP PROCEDURE add_job_history;

DROP PROCEDURE secure_dml;

DELETE FROM employees
WHERE manager_id IN (108, 114, 120, 121, 122, 123, 145, 146, 147, 148);

DELETE FROM employees
WHERE employee_id IN (114, 120, 121, 122, 123, 145, 146, 147, 148, 
                      196, 197, 198, 199, 105, 106, 108, 175, 177, 
                      179, 203, 204);

DELETE FROM locations
WHERE location_id NOT IN 
  (SELECT DISTINCT location_id
   FROM departments);

DELETE FROM countries
WHERE country_id NOT IN
  (SELECT country_id
   FROM locations);

DELETE FROM jobs
WHERE job_id NOT IN
  (SELECT job_id
   FROM employees);

DELETE FROM departments
WHERE department_id NOT IN 
  (SELECT DISTINCT department_id
   FROM employees
   WHERE department_id IS NOT NULL);

UPDATE departments
SET manager_id = 124
WHERE department_id = 50;

UPDATE departments
SET manager_id = 149
WHERE department_id = 80;

DELETE FROM locations
WHERE location_id IN (2700, 2400);

UPDATE locations
SET street_address = '460 Bloor St. W.', 
    postal_code = 'ON M5S 1X8'
WHERE location_id = 1800;

ALTER TABLE departments
ENABLE CONSTRAINT DEPT_MGR_FK;

CREATE TABLE job_grades
(grade_level VARCHAR2(3),
 lowest_sal  NUMBER,
 highest_sal NUMBER);

INSERT INTO job_grades
VALUES ('A', 1000, 2999);

INSERT INTO job_grades
VALUES ('B', 3000, 5999);

INSERT INTO job_grades
VALUES('C', 6000, 9999);

INSERT INTO job_grades
VALUES('D', 10000, 14999);

INSERT INTO job_grades
VALUES('E', 15000, 24999);

INSERT INTO job_grades
VALUES('F', 25000, 40000);

INSERT INTO departments VALUES 
        ( 190 
        , 'Contracting'
        , NULL
        , 1700
        );

COMMIT;

Here is how the tables relate to one another (the original post shows an ER diagram of the HR schema at this point; the image is not reproduced here).

Importing the tables into the database:
Using SQL Developer, run the scripts:

@D:/del_data.sql
@D:/hr_cre.sql
@D:/hr_popul.sql

This loads the SQL scripts into the database.

Next, run the project so that Hibernate automatically creates two more tables, hb_department and hb_employee; these table names are defined in the hbm.xml mapping files.

Now copy the data from the DEPARTMENTS table we just imported into the new hb_department table:

sql> INSERT  INTO  hb_department SELECT DEPARTMENT_ID,DEPARTMENT_NAME from DEPARTMENTS
[2018-01-15 16:32:52] 27 rows affected in 37ms

Then, in the same way, copy the data from the EMPLOYEES table into hb_employee:

sql> INSERT  INTO  HB_EMPLOYEE SELECT  EMPLOYEE_ID,LAST_NAME,SALARY,EMAIL,DEPARTMENT_ID FROM EMPLOYEES
[2018-01-15 16:34:36] 107 rows affected in 10ms
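Once the mapping files and the test class shown below are in place, a quick sanity check from the Hibernate side can confirm the copy worked. This is only a sketch, not part of the original post, and it assumes the same session field used in the test class:

    Long count = (Long) session.createQuery("select count(e) from Employee e").uniqueResult();
    System.out.println(count);  // expected: 107, matching the rows copied above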

马哥私房菜博客地址:https://github.com/mageSFC/myblog

Next, the Hibernate configuration file (hibernate.cfg.xml):

<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD//EN"
        "http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <!-- Basic database connection settings -->
        <property name="hibernate.connection.username">system</property>
        <property name="hibernate.connection.password">xxxx</property>
        <property name="hibernate.connection.driver_class">oracle.jdbc.driver.OracleDriver</property>
        <property name="hibernate.connection.url">jdbc:oracle:thin:@10.0.63.42:1521:orcl</property>
        <!-- Database dialect -->
        <property name="hibernate.dialect">org.hibernate.dialect.Oracle10gDialect</property>
        <!--<property name="dialect">org.hibernate.dialect.MySQL5Dialect</property>-->
        <!--<property name="dialect">org.hibernate.dialect.MySQLInnoDBDialect</property>-->

        <!--Whether to print the generated SQL-->
        <property name="show_sql">true</property>
        <!--Whether to format the printed SQL-->
        <property name="format_sql">true</property>

        <!--Schema auto-generation strategy (hbm2ddl.auto); 4 possible values: validate, update, create, create-drop-->
        <property name="hbm2ddl.auto">update</property>

        <property name="connection.isolation">2</property>
        <!--<property name="use_identifier_rollback">true</property>-->
        <!-- C3P0 connection pool settings (commented out)
        <property name="hibernate.connection.provider_class">
            org.hibernate.service.jdbc.connections.internal.C3P0ConnectionProvider
        </property>
        <property name="hibernate.c3p0.max_size">10</property>
        <property name="hibernate.c3p0.min_size">5</property>
        <property name="hibernate.c3p0.acquire_increment">5</property>
        <property name="hibernate.c3p0.timeout">20000</property>
        <property name="hibernate.c3p0.idle_test_period">20000</property>
        <property name="hibernate.c3p0.max_statements">2</property>
        -->

        <property name="hibernate.jdbc.fetch_size">100</property>
        <property name="hibernate.jdbc.batch_size">30</property>

        <mapping resource="Department.hbm.xml"/>
        <mapping resource="Employee.hbm.xml"/>

    </session-factory>
</hibernate-configuration>

Employee.hbm.xml (mapping file for the Employee entity):

<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-mapping PUBLIC
        "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
        "http://www.hibernate.org/dtd/hibernate-mapping-3.0.dtd">


<hibernate-mapping package="com.mamh.hibernate.hql.entities">
    <class name="Employee" table="hb_employee">
        <id name="id" type="java.lang.Integer">
            <column name="id"/>
            <generator class="native"/>
        </id>

        <property name="name" type="java.lang.String">
            <column name="name"/>
        </property>


        <property name="salary" type="float">
            <column name="salary"/>
        </property>

        <property name="email" type="java.lang.String">
            <column name="email"/>
        </property>

        <many-to-one name="dept" class="Department">
            <column name="dept_id"/>
        </many-to-one>

    </class>
</hibernate-mapping>

Department.hbm.xml (mapping file for the Department entity):

<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE hibernate-mapping PUBLIC
        "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
        "http://www.hibernate.org/dtd/hibernate-mapping-3.0.dtd">

<hibernate-mapping package="com.mamh.hibernate.hql.entities">

    <class name="Department" table="hb_department">
        <id name="id"  type="java.lang.String">
            <column name="id" />
            <generator class="native"/>
        </id>
        <property name="name" type="java.lang.String">
            <column name="name"/>
        </property>

        <set name="emps" table="hb_employee" inverse="false" lazy="true">
            <key>
                <column name="dept_id"/>
            </key>
            <one-to-many class="Employee"/>
        </set>
    </class>
</hibernate-mapping>

package com.mamh.hibernate.hql.entities;

import java.util.HashSet;
import java.util.Set;

public class Department {
    private String id;
    private String name;

    private Set<Employee> emps = new HashSet<Employee>();

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public Set<Employee> getEmps() {
        return emps;
    }

    public void setEmps(Set<Employee> emps) {
        this.emps = emps;
    }

    @Override
    public String toString() {
        return "Department{" +
                "id='" + id + '\'' +
                ", name='" + name + '\'' +
                '}';
    }
}

马哥私房菜博客地址:https://github.com/mageSFC/myblog

package com.mamh.hibernate.hql.entities;

public class Employee {
    private Integer id;
    private String name;
    private float salary;
    private String email;


    private Department dept;


    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public float getSalary() {
        return salary;
    }

    public void setSalary(float salary) {
        this.salary = salary;
    }

    public String getEmail() {
        return email;
    }

    public void setEmail(String email) {
        this.email = email;
    }

    public Department getDept() {
        return dept;
    }

    public void setDept(Department dept) {
        this.dept = dept;
    }

    @Override
    public String toString() {
        return "Employee{" +
                "id=" + id +
                ", name='" + name + '\'' +
                ", salary=" + salary +
                ", email='" + email + '\'' +
                ", dept=" + dept +
                '}'+'\n';
    }
}

马哥私房菜博客地址:https://github.com/mageSFC/myblog

Both entity classes are quite simple.


马哥私房菜博客地址:https://github.com/mageSFC/myblog

package com.mamh.hibernate.demo;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;
import org.hibernate.service.ServiceRegistry;
import org.hibernate.service.ServiceRegistryBuilder;
import org.junit.After;
import org.junit.Before;

public class HibernateHqlTest {
    SessionFactory sessionFactory = null;
    private Session session = null;
    private Transaction transaction = null;


    @Before
    public void init() {
        System.out.println("=init=");
        //1. Create the SessionFactory, the factory that produces Session objects
        //1.1 Create a Configuration object holding Hibernate's basic settings and the O/R mapping information
        Configuration configuration = new Configuration().configure();
        //1.2 Create a ServiceRegistry (introduced in Hibernate 4.x); every Hibernate setting and service must be registered with it to take effect
        ServiceRegistry serviceRegistry = new ServiceRegistryBuilder().applySettings(configuration.getProperties()).buildServiceRegistry();
        //1.3 Build the SessionFactory from the registry
        sessionFactory = configuration.buildSessionFactory(serviceRegistry);
        //2. Open a Session, roughly analogous to a JDBC Connection
        session = sessionFactory.openSession();
        //3. Begin a transaction
        transaction = session.beginTransaction();
    }



    @After
    public void destroy() {
        System.out.println("=destroy=");
        //5. Commit the transaction
        transaction.commit();

        //6. Close the Session
        session.close();

        //7. Close the SessionFactory
        sessionFactory.close();

    }

}
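The bootstrap above uses the ServiceRegistryBuilder API from Hibernate 4.0–4.2. On Hibernate 4.3+ and 5.x that class is deprecated and later removed; a hedged sketch of the equivalent setup (same hibernate.cfg.xml assumed, not part of the original post) would be:

    // Hibernate 4.3+/5.x bootstrap sketch
    // import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
    Configuration configuration = new Configuration().configure();
    ServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder()
            .applySettings(configuration.getProperties())
            .build();
    sessionFactory = configuration.buildSessionFactory(serviceRegistry);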

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Add a test method for a basic HQL query with positional parameters:

    @Test
    public void testHQL(){

        String hql = "from Employee e where e.salary > ? and e.email like ?";
        Query query = session.createQuery(hql);

        query.setFloat(0, 6000).setString(1, "%A%");

        List list = query.list();
        System.out.println(list);


    }




马哥私房菜博客地址:https://github.com/mageSFC/myblog


Hibernate: 
    select
        employee0_.id as id1_,
        employee0_.name as name1_,
        employee0_.salary as salary1_,
        employee0_.email as email1_,
        employee0_.dept_id as dept5_1_ 
    from
        hb_employee employee0_ 
    where
        employee0_.salary>? 
        and (
            employee0_.email like ?
        )
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,
        department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
[Employee{id=101, name='Kochhar', salary=17000.0, email='NKOCHHAR', dept=Department{id='90', name='Executive'}}
, Employee{id=102, name='De Haan', salary=17000.0, email='LDEHAAN', dept=Department{id='90', name='Executive'}}
, Employee{id=103, name='Hunold', salary=9000.0, email='AHUNOLD', dept=Department{id='60', name='IT'}}
, Employee{id=109, name='Faviet', salary=9000.0, email='DFAVIET', dept=Department{id='100', name='Finance'}}
, Employee{id=111, name='Sciarra', salary=7700.0, email='ISCIARRA', dept=Department{id='100', name='Finance'}}
, Employee{id=112, name='Urman', salary=7800.0, email='JMURMAN', dept=Department{id='100', name='Finance'}}
, Employee{id=114, name='Raphaely', salary=11000.0, email='DRAPHEAL', dept=Department{id='30', name='Purchasing'}}
, Employee{id=121, name='Fripp', salary=8200.0, email='AFRIPP', dept=Department{id='50', name='Shipping'}}
, Employee{id=122, name='Kaufling', salary=7900.0, email='PKAUFLIN', dept=Department{id='50', name='Shipping'}}
, Employee{id=123, name='Vollman', salary=6500.0, email='SVOLLMAN', dept=Department{id='50', name='Shipping'}}
, Employee{id=146, name='Partners', salary=13500.0, email='KPARTNER', dept=Department{id='80', name='Sales'}}
, Employee{id=147, name='Errazuriz', salary=12000.0, email='AERRAZUR', dept=Department{id='80', name='Sales'}}
, Employee{id=148, name='Cambrault', salary=11000.0, email='GCAMBRAU', dept=Department{id='80', name='Sales'}}
, Employee{id=152, name='Hall', salary=9000.0, email='PHALL', dept=Department{id='80', name='Sales'}}
, Employee{id=154, name='Cambrault', salary=7500.0, email='NCAMBRAU', dept=Department{id='80', name='Sales'}}
, Employee{id=155, name='Tuvault', salary=7000.0, email='OTUVAULT', dept=Department{id='80', name='Sales'}}
, Employee{id=158, name='McEwen', salary=9000.0, email='AMCEWEN', dept=Department{id='80', name='Sales'}}
, Employee{id=160, name='Doran', salary=7500.0, email='LDORAN', dept=Department{id='80', name='Sales'}}
, Employee{id=161, name='Sewall', salary=7000.0, email='SSEWALL', dept=Department{id='80', name='Sales'}}
, Employee{id=164, name='Marvins', salary=7200.0, email='MMARVINS', dept=Department{id='80', name='Sales'}}
, Employee{id=166, name='Ande', salary=6400.0, email='SANDE', dept=Department{id='80', name='Sales'}}
, Employee{id=167, name='Banda', salary=6200.0, email='ABANDA', dept=Department{id='80', name='Sales'}}
, Employee{id=172, name='Bates', salary=7300.0, email='EBATES', dept=Department{id='80', name='Sales'}}
, Employee{id=173, name='Kumar', salary=6100.0, email='SKUMAR', dept=Department{id='80', name='Sales'}}
, Employee{id=174, name='Abel', salary=11000.0, email='EABEL', dept=Department{id='80', name='Sales'}}
, Employee{id=175, name='Hutton', salary=8800.0, email='AHUTTON', dept=Department{id='80', name='Sales'}}
, Employee{id=176, name='Taylor', salary=8600.0, email='JTAYLOR', dept=Department{id='80', name='Sales'}}
, Employee{id=178, name='Grant', salary=7000.0, email='KGRANT', dept=null}
, Employee{id=201, name='Hartstein', salary=13000.0, email='MHARTSTE', dept=Department{id='20', name='Marketing'}}
, Employee{id=203, name='Mavris', salary=6500.0, email='SMAVRIS', dept=Department{id='40', name='Human Resources'}}
, Employee{id=204, name='Baer', salary=10000.0, email='HBAER', dept=Department{id='70', name='Public Relations'}}
]
=destroy=

马哥私房菜博客地址:https://github.com/mageSFC/myblog

A variant that uses named parameters instead of positional ones:

    @Test
    public void testHQL1() {  // using named parameters
        String hql = "from Employee e where e.salary > :sal and e.email like :email";
        Query query = session.createQuery(hql);


        query.setFloat("sal", 6000).setString("email", "%B%");

        List list = query.list();
        System.out.println(list);
        System.out.println(list.size());
    }

马哥私房菜博客地址:https://github.com/mageSFC/myblog

You can also add an order by clause:

    @Test
    public void testHQL1() {  // using named parameters
        String hql = "from Employee e where e.salary > :sal and e.email like :email  order by e.salary";
        Query query = session.createQuery(hql);


        query.setFloat("sal", 6000).setString("email", "%B%");

        List list = query.list();
        System.out.println(list);
        System.out.println(list.size());
    }

Hibernate: 

    select
        employee0_.id as id1_,
        employee0_.name as name1_,
        employee0_.salary as salary1_,
        employee0_.email as email1_,
        employee0_.dept_id as dept5_1_ 
    from
        hb_employee employee0_ 
    where
        employee0_.salary>? 
        and (
            employee0_.email like ?
        ) 
    order by
        employee0_.salary
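Ordering can also be descending and can span several properties; a minimal sketch (assumed, not from the original post):

    String hql = "from Employee e where e.salary > :sal order by e.salary desc, e.name asc";
    List list = session.createQuery(hql).setFloat("sal", 6000).list();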

Using an entity object as a query parameter:

马哥私房菜博客地址:https://github.com/mageSFC/myblog

    @Test
    public void testHQL1() {  // named parameters plus an entity-valued parameter
        String hql = "from Employee e where e.salary > :sal and e.email like :email and e.dept = :dept order by e.salary";
        Query query = session.createQuery(hql);


        Department dept=new Department();
        dept.setId("80"); // Department.id is mapped as java.lang.String, so the id is passed as a String
        query.setFloat("sal", 6000)
                .setString("email", "%B%")
                .setEntity("dept", dept);

        List list = query.list();
        //System.out.println(list);
        System.out.println(list.size());
    }
Hibernate: 
    select
        employee0_.id as id1_,
        employee0_.name as name1_,
        employee0_.salary as salary1_,
        employee0_.email as email1_,
        employee0_.dept_id as dept5_1_ 
    from
        hb_employee employee0_ 
    where
        employee0_.salary>? 
        and (
            employee0_.email like ?
        ) 
        and employee0_.dept_id=? 
    order by
        employee0_.salary
7
=destroy=

马哥私房菜博客地址:https://github.com/mageSFC/myblog


Paged queries
setFirstResult(int firstResult) sets which object to start retrieving from; firstResult is the index of that object within the query result.
The index starts at 0, and by default the query starts from the first object in the result.

setMaxResults(int maxResult) sets the maximum number of objects retrieved at once; by default the Query and Criteria interfaces retrieve every object in the result.

马哥私房菜博客地址:https://github.com/mageSFC/myblog


    @Test
    public void testHQL2() { // paged query
        String hql = "from Employee";
        Query query = session.createQuery(hql);
        int pageNo = 3;
        int pageSize = 5;

        query.setFirstResult((pageNo - 1) * pageSize);
        query.setMaxResults(pageSize);
        List list = query.list();
        System.out.println(list);


    }




Hibernate: 
    select
        * 
    from
        ( select row_.*, rownum rownum_ 
        from
            ( select  employee0_.id as id1_, employee0_.name as name1_,
             employee0_.salary as salary1_, employee0_.email as email1_, 
             employee0_.dept_id as dept5_1_ 
            from
                hb_employee employee0_ ) row_ 
        where
            rownum <= ?
        ) 
    where
        rownum_ > ?
Hibernate: 
    select
        department0_.id as id0_0_, department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?
Hibernate: 
    select
        department0_.id as id0_0_,department0_.name as name0_0_ 
    from
        hb_department department0_ 
    where
        department0_.id=?

[Employee{id=110, name='Chen', salary=8200.0, email='JCHEN', dept=Department{id='100', name='Finance'}}
, Employee{id=111, name='Sciarra', salary=7700.0, email='ISCIARRA', dept=Department{id='100', name='Finance'}}
, Employee{id=112, name='Urman', salary=7800.0, email='JMURMAN', dept=Department{id='100', name='Finance'}}
, Employee{id=113, name='Popp', salary=6900.0, email='LPOPP', dept=Department{id='100', name='Finance'}}
, Employee{id=114, name='Raphaely', salary=11000.0, email='DRAPHEAL', dept=Department{id='30', name='Purchasing'}}
]


马哥私房菜博客地址:https://github.com/mageSFC/myblog 
This fetches page 3 with 5 records per page; here the result happens to start at employee id 110.
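If several tests need paging, the offset arithmetic can be pulled into a small helper. This is a hypothetical sketch (the findPage name and generic signature are not from the original post) that assumes the same session field as the test class:

    @SuppressWarnings("unchecked")
    private <T> List<T> findPage(String hql, int pageNo, int pageSize) {
        Query query = session.createQuery(hql);
        query.setFirstResult((pageNo - 1) * pageSize); // 0-based index of the first row on the requested page
        query.setMaxResults(pageSize);                 // number of rows per page
        return query.list();
    }

    // usage: List<Employee> page3 = findPage("from Employee", 3, 5);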

Named queries
The <query> element, a sibling of <class> in the mapping file,
lets you place the HQL statement in the XML configuration:

马哥私房菜博客地址:https://github.com/mageSFC/myblog


<hibernate-mapping package="com.mamh.hibernate.hql.entities">

    <class name="Employee" table="hb_employee">
        <id name="id" type="java.lang.Integer">
            <column name="id"/>
            <generator class="native"/>
        </id>

        <property name="name" type="java.lang.String">
            <column name="name"/>
        </property>


        <property name="salary" type="float">
            <column name="salary"/>
        </property>

        <property name="email" type="java.lang.String">
            <column name="email"/>
        </property>

        <many-to-one name="dept" class="Department">
            <column name="dept_id"/>
        </many-to-one>


    </class>


    <query name="salary">
        <![
        CDATA[
            FROM Employee  e where e.salary > :minSalary and e.salary < :maxSalary
            ]
        ]
        >
    </query>
</hibernate-mapping>

The test method that runs this named query:

    @Test
    public void testHQL3() {
        Query query = session.getNamedQuery("salary");

        query.setFloat("minSalary", 5000);
        query.setFloat("maxSalary", 10000);

        List list = query.list();
        System.out.println(list.size());
    }

Projection queries
The query result contains only some of the entity's properties.

    String hql = "" +
            "select new Employee(e.id, e.name, e.salary, e.email) " +
            "from Employee e " +
            "where e.dept = :dept";

The key point of this HQL is that it invokes Employee's parameterized constructor (shown further below). First, a projection that selects individual columns:


@Test
public void testHQL4() {
    String hql = "select e.email,e.salary from Employee e where e.dept= :dept";
    Query query = session.createQuery(hql);

    Department dept = new Department();
    dept.setId("80"); // id is a String in the mapping
    List<Object []> list = query.setEntity("dept", dept).list();

    for (Object[] o : list) {
        System.out.println(Arrays.asList(o));
    }

}

Hibernate:
select
employee0_.email as col_0_0_,
employee0_.salary as col_1_0_
from
hb_employee employee0_
where
employee0_.dept_id=?
[JRUSSEL, 14000.0]
[KPARTNER, 13500.0]
[AERRAZUR, 12000.0]
[GCAMBRAU, 11000.0]
[EZLOTKEY, 10500.0]
[PTUCKER, 10000.0]
[DBERNSTE, 9500.0]
[PHALL, 9000.0]
[COLSEN, 8000.0]
[NCAMBRAU, 7500.0]
[OTUVAULT, 7000.0]
[JKING, 10000.0]
[PSULLY, 9500.0]
[AMCEWEN, 9000.0]
[LSMITH, 8000.0]
[LDORAN, 7500.0]
[SSEWALL, 7000.0]
[CVISHNEY, 10500.0]
[DGREENE, 9500.0]
[MMARVINS, 7200.0]
[DLEE, 6800.0]
[SANDE, 6400.0]
[ABANDA, 6200.0]
[LOZER, 11500.0]
[HBLOOM, 10000.0]
[TFOX, 9600.0]
[WSMITH, 7400.0]
[EBATES, 7300.0]
[SKUMAR, 6100.0]
[EABEL, 11000.0]
[AHUTTON, 8800.0]
[JTAYLOR, 8600.0]
[JLIVINGS, 8400.0]
[CJOHNSON, 6200.0]

@Test
public void testHQL4() {
    String hql = "" +
            "select new Employee (e.id, e.name,e.salary,e.email) " +
            "from Employee e " +
            "where e.dept= :dept";
    Query query = session.createQuery(hql);

    Department dept = new Department();
    dept.setId("80"); // id is a String in the mapping

    List<Employee> list = query.setEntity("dept", dept).list();

    for (Employee o : list) {
        System.out.println(o);
    }

}

public class Employee {
private Integer id;
private String name;
private float salary;
private String email;

private Department dept;


public Employee() {
}

public Employee(Integer id, String name, float salary, String email) {
    this.id = id;
    this.name = name;
    this.salary = salary;
    this.email = email;
}

Hibernate:
select
employee0_.id as col_0_0_,
employee0_.name as col_1_0_,
employee0_.salary as col_2_0_,
employee0_.email as col_3_0_
from
hb_employee employee0_
where
employee0_.dept_id=?

Employee{id=145, name=’Russell’, salary=14000.0, email=’JRUSSEL’, dept=null}

Employee{id=146, name=’Partners’, salary=13500.0, email=’KPARTNER’, dept=null}

Employee{id=147, name=’Errazuriz’, salary=12000.0, email=’AERRAZUR’, dept=null}

Employee{id=148, name=’Cambrault’, salary=11000.0, email=’GCAMBRAU’, dept=null}

Employee{id=149, name=’Zlotkey’, salary=10500.0, email=’EZLOTKEY’, dept=null}

Employee{id=150, name=’Tucker’, salary=10000.0, email=’PTUCKER’, dept=null}

Employee{id=151, name=’Bernstein’, salary=9500.0, email=’DBERNSTE’, dept=null}

Employee{id=152, name=’Hall’, salary=9000.0, email=’PHALL’, dept=null}

Employee{id=153, name=’Olsen’, salary=8000.0, email=’COLSEN’, dept=null}

Employee{id=154, name=’Cambrault’, salary=7500.0, email=’NCAMBRAU’, dept=null}

Employee{id=155, name=’Tuvault’, salary=7000.0, email=’OTUVAULT’, dept=null}

Employee{id=156, name=’King’, salary=10000.0, email=’JKING’, dept=null}

Employee{id=157, name=’Sully’, salary=9500.0, email=’PSULLY’, dept=null}

Employee{id=158, name=’McEwen’, salary=9000.0, email=’AMCEWEN’, dept=null}

Employee{id=159, name=’Smith’, salary=8000.0, email=’LSMITH’, dept=null}

Employee{id=160, name=’Doran’, salary=7500.0, email=’LDORAN’, dept=null}

Employee{id=161, name=’Sewall’, salary=7000.0, email=’SSEWALL’, dept=null}

Employee{id=162, name=’Vishney’, salary=10500.0, email=’CVISHNEY’, dept=null}

Employee{id=163, name=’Greene’, salary=9500.0, email=’DGREENE’, dept=null}

Employee{id=164, name=’Marvins’, salary=7200.0, email=’MMARVINS’, dept=null}

Employee{id=165, name=’Lee’, salary=6800.0, email=’DLEE’, dept=null}

Employee{id=166, name=’Ande’, salary=6400.0, email=’SANDE’, dept=null}

Employee{id=167, name=’Banda’, salary=6200.0, email=’ABANDA’, dept=null}

Employee{id=168, name=’Ozer’, salary=11500.0, email=’LOZER’, dept=null}

Employee{id=169, name=’Bloom’, salary=10000.0, email=’HBLOOM’, dept=null}

Employee{id=170, name=’Fox’, salary=9600.0, email=’TFOX’, dept=null}

Employee{id=171, name=’Smith’, salary=7400.0, email=’WSMITH’, dept=null}

Employee{id=172, name=’Bates’, salary=7300.0, email=’EBATES’, dept=null}

Employee{id=173, name=’Kumar’, salary=6100.0, email=’SKUMAR’, dept=null}

Employee{id=174, name=’Abel’, salary=11000.0, email=’EABEL’, dept=null}

Employee{id=175, name=’Hutton’, salary=8800.0, email=’AHUTTON’, dept=null}

Employee{id=176, name=’Taylor’, salary=8600.0, email=’JTAYLOR’, dept=null}

Employee{id=177, name=’Livingston’, salary=8400.0, email=’JLIVINGS’, dept=null}

Employee{id=179, name=’Johnson’, salary=6200.0, email=’CJOHNSON’, dept=null}

=destroy=

Process finished with exit code 0

马哥私房菜博客地址:https://github.com/mageSFC/myblog 


Report queries

You can use group by, having, aggregate functions, and so on:
@Test
public void testHQLGroupBy() {
    String hql = "select min(e.salary), max(e.salary) " +
            "from Employee e " +
            "group by e.dept " +
            "having min(salary) > :minSalary";
    Query query = session.createQuery(hql);

    query.setFloat("minSalary", 8000);

    List<Object[]> list = query.list();

    for (Object[] o : list) {
        System.out.println(Arrays.asList(o));
    }


}

Hibernate:
select
min(employee0_.salary) as col_0_0_,
max(employee0_.salary) as col_1_0_
from
hb_employee employee0_
group by
employee0_.dept_id
having
min(employee0_.salary)>?
[8300.0, 12000.0]
[17000.0, 24000.0]
[10000.0, 10000.0]
=destroy=

Process finished with exit code 0

----------
HQL eager left outer join (left join fetch)

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Eager left outer join: uses the left join fetch keywords.
The collection returned by list() holds entity object references; each Department object's associated
employee collection is initialized and holds all of its associated Employee entities.
The result may contain duplicate elements, which can be filtered out with a HashSet (examples below).

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Left outer join: uses the left join keyword (without fetch).
The collection returned by list() holds object arrays.
The retrieval strategy for the employee collection is determined by the mapping file.
If you want list() to return only Department objects, add the select keyword to the HQL statement.

@Test
public void testHQLLeftJoin(){
    String hql="from Department d left join fetch d.emps";
    Query query = session.createQuery(hql);

    List list = query.list();
    System.out.println(list);
    System.out.println(list.size());

}

Hibernate:
select
department0_.id as id0_0_,
emps1_.id as id1_1_,
department0_.name as name0_0_,
emps1_.name as name1_1_,
emps1_.salary as salary1_1_,
emps1_.email as email1_1_,
emps1_.dept_id as dept5_1_1_,
emps1_.dept_id as dept5_0_0__,
emps1_.id as id0__
from
hb_department department0_
left outer join
hb_employee emps1_
on department0_.id=emps1_.dept_id

[Department{id=’90’, name=’Executive’}
, Department{id=’90’, name=’Executive’}
, Department{id=’90’, name=’Executive’}
, Department{id=’60’, name=’IT’}
, Department{id=’60’, name=’IT’}
, Department{id=’60’, name=’IT’}
, Department{id=’60’, name=’IT’}
, Department{id=’60’, name=’IT’}
, Department{id=’100’, name=’Finance’}
, Department{id=’100’, name=’Finance’}
, Department{id=’100’, name=’Finance’}
, Department{id=’100’, name=’Finance’}
, Department{id=’100’, name=’Finance’}
, Department{id=’100’, name=’Finance’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’80’, name=’Sales’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’10’, name=’Administration’}
, Department{id=’20’, name=’Marketing’}
, Department{id=’20’, name=’Marketing’}
, Department{id=’40’, name=’Human Resources’}
, Department{id=’70’, name=’Public Relations’}
, Department{id=’110’, name=’Accounting’}
, Department{id=’110’, name=’Accounting’}
, Department{id=’160’, name=’Benefits’}
, Department{id=’200’, name=’Operations’}
, Department{id=’230’, name=’IT Helpdesk’}
, Department{id=’140’, name=’Control And Credit’}
, Department{id=’130’, name=’Corporate Tax’}
, Department{id=’240’, name=’Government Sales’}
, Department{id=’170’, name=’Manufacturing’}
, Department{id=’210’, name=’IT Support’}
, Department{id=’180’, name=’Construction’}
, Department{id=’150’, name=’Shareholder Services’}
, Department{id=’260’, name=’Recruiting’}
, Department{id=’220’, name=’NOC’}
, Department{id=’120’, name=’Treasury’}
, Department{id=’190’, name=’Contracting’}
, Department{id=’250’, name=’Retail Sales’}
, Department{id=’270’, name=’Payroll’}
]
122
=destroy=

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

The HQL above returns duplicate rows, so let's modify the statement:
@Test
public void testHQLLeftJoin(){
    String hql="select distinct d from  Department d left join fetch d.emps";
    Query query = session.createQuery(hql);

    List list = query.list();
    System.out.println(list);
    System.out.println(list.size());

}

Hibernate:
select
distinct department0_.id as id0_0_,
emps1_.id as id1_1_,
department0_.name as name0_0_,
emps1_.name as name1_1_,
emps1_.salary as salary1_1_,
emps1_.email as email1_1_,
emps1_.dept_id as dept5_1_1_,
emps1_.dept_id as dept5_0_0__,
emps1_.id as id0__
from
hb_department department0_
left outer join
hb_employee emps1_
on department0_.id=emps1_.dept_id
[Department{id=’90’, name=’Executive’}
, Department{id=’60’, name=’IT’}
, Department{id=’100’, name=’Finance’}
, Department{id=’30’, name=’Purchasing’}
, Department{id=’50’, name=’Shipping’}
, Department{id=’80’, name=’Sales’}
, Department{id=’10’, name=’Administration’}
, Department{id=’20’, name=’Marketing’}
, Department{id=’40’, name=’Human Resources’}
, Department{id=’70’, name=’Public Relations’}
, Department{id=’110’, name=’Accounting’}
, Department{id=’160’, name=’Benefits’}
, Department{id=’200’, name=’Operations’}
, Department{id=’230’, name=’IT Helpdesk’}
, Department{id=’140’, name=’Control And Credit’}
, Department{id=’130’, name=’Corporate Tax’}
, Department{id=’240’, name=’Government Sales’}
, Department{id=’170’, name=’Manufacturing’}
, Department{id=’210’, name=’IT Support’}
, Department{id=’180’, name=’Construction’}
, Department{id=’150’, name=’Shareholder Services’}
, Department{id=’260’, name=’Recruiting’}
, Department{id=’220’, name=’NOC’}
, Department{id=’120’, name=’Treasury’}
, Department{id=’190’, name=’Contracting’}
, Department{id=’250’, name=’Retail Sales’}
, Department{id=’270’, name=’Payroll’}
]
27
=destroy=

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Another way to remove duplicate elements
is to filter the returned list through a HashSet (a LinkedHashSet here, so the original order is preserved):
@Test
public void testHQLLeftJoin1() {
    String hql = "from  Department d left join fetch d.emps";
    Query query = session.createQuery(hql);

    List list = query.list();
    System.out.println(new LinkedHashSet(list).size());
    System.out.println(list.size());

}

马哥私房菜博客地址:https://github.com/mageSFC/myblog 


/**
 * Left outer join: uses the left join keyword (without fetch)
 * The collection returned by list() holds object arrays
 * The retrieval strategy for the employee collection is determined by the mapping file
 * If you want list() to return only Department objects,
 * add the select keyword to the HQL statement
 */
@Test
public void testHQLLeftJoin2() {
    String hql = "from  Department d left join d.emps";
    Query query = session.createQuery(hql);

    List<Object[]> list = query.list();
    System.out.println(list.size());

    for (Object[] o : list) { // each element is an Object[] holding {Department, Employee}
        System.out.println(Arrays.asList(o));
    }

    //        System.out.println(list.get(0)[0]);
    //        System.out.println(list.get(0)[1]);


}
@Test
public void testHQLLeftJoin2() {
    String hql = "select  distinct  d from  Department d left join d.emps";
    Query query = session.createQuery(hql);

    List<Department> list = query.list();
    System.out.println(list.size());

    for (Department department : list) {
        System.out.println(department);
        //System.out.println(department.getEmps());  // the employees associated with the department are not initialized here (no fetch)
    }

}

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

----------
HQL eager inner join (inner join fetch)

Eager inner join: rows from the left table with no matching row are not returned.
The inner join fetch keywords denote an eager inner join (the inner keyword may be omitted).
The collection returned by list() holds Department references; each Department object's employee collection is initialized
and holds all of its associated Employee objects.

马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Inner join:
The inner join keyword denotes an inner join (inner may be omitted).
Each element of the collection returned by list() corresponds to one row of the result and is an object array.
If you want list() to return only Department objects, add the select keyword to the HQL statement.

/**
 * Eager inner join:
 * The inner join fetch keywords denote an eager inner join; the inner keyword may be omitted
 * The collection returned by list() holds Department references; each Department's employee collection
 * is initialized and holds all of the associated Employee objects
 * <p>
 * Rows from the left table with no matching row are not returned
 */
@Test
public void testHQLInnerJoin() {
    String hql = "select  distinct d from  Department d inner join fetch d.emps";
    //  String hql = " from  Department d inner join fetch d.emps";
    Query query = session.createQuery(hql);

    List list = query.list();
    System.out.println(list.size());

}

Hibernate:
select
distinct department0_.id as id0_0_,
emps1_.id as id1_1_,
department0_.name as name0_0_,
emps1_.name as name1_1_,
emps1_.salary as salary1_1_,
emps1_.email as email1_1_,
emps1_.dept_id as dept5_1_1_,
emps1_.dept_id as dept5_0_0__,
emps1_.id as id0__
from
hb_department department0_
inner join
hb_employee emps1_
on department0_.id=emps1_.dept_id
11
=destroy=


马哥私房菜博客地址:https://github.com/mageSFC/myblog 

Now the other way around: query Employee, first using left join fetch:
@Test
public void testHQLJoin() {
    String hql = "select e from  Employee e left join fetch e.dept";
    Query query = session.createQuery(hql);

    List<Employee> list = query.list();
    System.out.println(list.size());

    for (Employee employee : list) {
        System.out.println(employee.getName() + ", " + employee.getDept().getName());
        //System.out.println(employee);
    }

}

Hibernate:
select
employee0_.id as id1_0_,
department1_.id as id0_1_,
employee0_.name as name1_0_,
employee0_.salary as salary1_0_,
employee0_.email as email1_0_,
employee0_.dept_id as dept5_1_0_,
department1_.name as name0_1_
from
hb_employee employee0_
left outer join
hb_department department1_
on employee0_.dept_id=department1_.id
107
Whalen, Administration
Hartstein, Marketing
Fay, Marketing
Raphaely, Purchasing
Khoo, Purchasing
Baida, Purchasing
Tobias, Purchasing
Himuro, Purchasing
Colmenares, Purchasing
Mavris, Human Resources
Weiss, Shipping
Fripp, Shipping
Kaufling, Shipping
Vollman, Shipping
Mourgos, Shipping
Nayer, Shipping
Mikkilineni, Shipping
Landry, Shipping
Markle, Shipping
Bissot, Shipping
Atkinson, Shipping
Marlow, Shipping
Olson, Shipping
Mallin, Shipping
Rogers, Shipping
Gee, Shipping
Philtanker, Shipping
Ladwig, Shipping
Stiles, Shipping
Seo, Shipping
Patel, Shipping
Rajs, Shipping
Davies, Shipping
Matos, Shipping
Vargas, Shipping
Taylor, Shipping
Fleaur, Shipping
Sullivan, Shipping
Geoni, Shipping
Sarchand, Shipping
Bull, Shipping
Dellinger, Shipping
Cabrio, Shipping
Chung, Shipping
Dilly, Shipping
Gates, Shipping
Perkins, Shipping
Bell, Shipping
Everett, Shipping
McCain, Shipping
Jones, Shipping
Walsh, Shipping
Feeney, Shipping
OConnell, Shipping
Grant, Shipping
Hunold, IT
Ernst, IT
Austin, IT
Pataballa, IT
Lorentz, IT
Baer, Public Relations
Russell, Sales
Partners, Sales
Errazuriz, Sales
Cambrault, Sales
Zlotkey, Sales
Tucker, Sales
Bernstein, Sales
Hall, Sales
Olsen, Sales
Cambrault, Sales
Tuvault, Sales
King, Sales
Sully, Sales
McEwen, Sales
Smith, Sales
Doran, Sales
Sewall, Sales
Vishney, Sales
Greene, Sales
Marvins, Sales
Lee, Sales
Ande, Sales
Banda, Sales
Ozer, Sales
Bloom, Sales
Fox, Sales
Smith, Sales
Bates, Sales
Kumar, Sales
Abel, Sales
Hutton, Sales
Taylor, Sales
Livingston, Sales
Johnson, Sales
King, Executive
Kochhar, Executive
De Haan, Executive
Greenberg, Finance
Faviet, Finance
Chen, Finance
Sciarra, Finance
Urman, Finance
Popp, Finance
Higgins, Accounting
Gietz, Accounting
=destroy=

java.lang.NullPointerException
at com.mamh.hibernate.demo.HibernateHqlTest.testHQLJoin(HibernateHqlTest.java:261)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)

In this case the left join fetch version throws an exception: some employees have no department (dept is null), so the loop's call to employee.getDept().getName() raises a NullPointerException.
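If you would rather keep the left join fetch, so that employees without a department still appear in the result, a null check inside the loop is enough to avoid the NPE. A minimal sketch, not from the original post:

    for (Employee employee : list) {
        // some employees (e.g. id 178) have no department, so guard before dereferencing
        String deptName = (employee.getDept() != null) ? employee.getDept().getName() : "(no department)";
        System.out.println(employee.getName() + ", " + deptName);
    }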

Switching to an inner join fetch also avoids the error, because employees without a department are simply not returned:
@Test
public void testHQLJoin() {
    String hql = "select e from  Employee e inner join fetch e.dept";
    Query query = session.createQuery(hql);

    List<Employee> list = query.list();
    System.out.println(list.size());

    for (Employee employee : list) {
        System.out.println(employee.getName() + ", " + employee.getDept().getName());
    }

}

Hibernate:
select
employee0_.id as id1_0_,
department1_.id as id0_1_,
employee0_.name as name1_0_,
employee0_.salary as salary1_0_,
employee0_.email as email1_0_,
employee0_.dept_id as dept5_1_0_,
department1_.name as name0_1_
from
hb_employee employee0_
inner join
hb_department department1_
on employee0_.dept_id=department1_.id
106
King, Executive
Kochhar, Executive
De Haan, Executive
Hunold, IT
Ernst, IT
Austin, IT
Pataballa, IT
Lorentz, IT
Greenberg, Finance
Faviet, Finance
Chen, Finance
Sciarra, Finance
Urman, Finance
Popp, Finance
Raphaely, Purchasing
Khoo, Purchasing
Baida, Purchasing
Tobias, Purchasing
Himuro, Purchasing
Colmenares, Purchasing
Weiss, Shipping
Fripp, Shipping
Kaufling, Shipping
Vollman, Shipping
Mourgos, Shipping
Nayer, Shipping
Mikkilineni, Shipping
Landry, Shipping
Markle, Shipping
Bissot, Shipping
Atkinson, Shipping
Marlow, Shipping
Olson, Shipping
Mallin, Shipping
Rogers, Shipping
Gee, Shipping
Philtanker, Shipping
Ladwig, Shipping
Stiles, Shipping
Seo, Shipping
Patel, Shipping
Rajs, Shipping
Davies, Shipping
Matos, Shipping
Vargas, Shipping
Russell, Sales
Partners, Sales
Errazuriz, Sales
Cambrault, Sales
Zlotkey, Sales
Tucker, Sales
Bernstein, Sales
Hall, Sales
Olsen, Sales
Cambrault, Sales
Tuvault, Sales
King, Sales
Sully, Sales
McEwen, Sales
Smith, Sales
Doran, Sales
Sewall, Sales
Vishney, Sales
Greene, Sales
Marvins, Sales
Lee, Sales
Ande, Sales
Banda, Sales
Ozer, Sales
Bloom, Sales
Fox, Sales
Smith, Sales
Bates, Sales
Kumar, Sales
Abel, Sales
Hutton, Sales
Taylor, Sales
Livingston, Sales
Johnson, Sales
Taylor, Shipping
Fleaur, Shipping
Sullivan, Shipping
Geoni, Shipping
Sarchand, Shipping
Bull, Shipping
Dellinger, Shipping
Cabrio, Shipping
Chung, Shipping
Dilly, Shipping
Gates, Shipping
Perkins, Shipping
Bell, Shipping
Everett, Shipping
McCain, Shipping
Jones, Shipping
Walsh, Shipping
Feeney, Shipping
OConnell, Shipping
Grant, Shipping
Whalen, Administration
Hartstein, Marketing
Fay, Marketing
Mavris, Human Resources
Baer, Public Relations
Higgins, Accounting
Gietz, Accounting
=destroy=

Process finished with exit code 0

马哥私房菜博客地址:https://github.com/mageSFC/myblog 
What if we use a plain inner join, without fetch? The associated objects are not initialized eagerly; each Department is loaded lazily when it is first accessed:
@Test
public void testHQLJoin() {
    String hql = "select e from  Employee e inner join  e.dept";
    Query query = session.createQuery(hql);

    List<Employee> list = query.list();
    System.out.println(list.size());

    for (Employee employee : list) {
        System.out.println(employee.getName() + ", " + employee.getDept().getName());
    }

}

Hibernate:
select
employee0_.id as id1_,
employee0_.name as name1_,
employee0_.salary as salary1_,
employee0_.email as email1_,
employee0_.dept_id as dept5_1_
from
hb_employee employee0_
inner join
hb_department department1_
on employee0_.dept_id=department1_.id
106
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
King, Executive
Kochhar, Executive
De Haan, Executive
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Hunold, IT
Ernst, IT
Austin, IT
Pataballa, IT
Lorentz, IT
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Greenberg, Finance
Faviet, Finance
Chen, Finance
Sciarra, Finance
Urman, Finance
Popp, Finance
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Raphaely, Purchasing
Khoo, Purchasing
Baida, Purchasing
Tobias, Purchasing
Himuro, Purchasing
Colmenares, Purchasing
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Weiss, Shipping
Fripp, Shipping
Kaufling, Shipping
Vollman, Shipping
Mourgos, Shipping
Nayer, Shipping
Mikkilineni, Shipping
Landry, Shipping
Markle, Shipping
Bissot, Shipping
Atkinson, Shipping
Marlow, Shipping
Olson, Shipping
Mallin, Shipping
Rogers, Shipping
Gee, Shipping
Philtanker, Shipping
Ladwig, Shipping
Stiles, Shipping
Seo, Shipping
Patel, Shipping
Rajs, Shipping
Davies, Shipping
Matos, Shipping
Vargas, Shipping
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Russell, Sales
Partners, Sales
Errazuriz, Sales
Cambrault, Sales
Zlotkey, Sales
Tucker, Sales
Bernstein, Sales
Hall, Sales
Olsen, Sales
Cambrault, Sales
Tuvault, Sales
King, Sales
Sully, Sales
McEwen, Sales
Smith, Sales
Doran, Sales
Sewall, Sales
Vishney, Sales
Greene, Sales
Marvins, Sales
Lee, Sales
Ande, Sales
Banda, Sales
Ozer, Sales
Bloom, Sales
Fox, Sales
Smith, Sales
Bates, Sales
Kumar, Sales
Abel, Sales
Hutton, Sales
Taylor, Sales
Livingston, Sales
Johnson, Sales
Taylor, Shipping
Fleaur, Shipping
Sullivan, Shipping
Geoni, Shipping
Sarchand, Shipping
Bull, Shipping
Dellinger, Shipping
Cabrio, Shipping
Chung, Shipping
Dilly, Shipping
Gates, Shipping
Perkins, Shipping
Bell, Shipping
Everett, Shipping
McCain, Shipping
Jones, Shipping
Walsh, Shipping
Feeney, Shipping
OConnell, Shipping
Grant, Shipping
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Whalen, Administration
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Hartstein, Marketing
Fay, Marketing
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Mavris, Human Resources
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Baer, Public Relations
Hibernate:
select
department0_.id as id0_0_,
department0_.name as name0_0_
from
hb_department department0_
where
department0_.id=?
Higgins, Accounting
Gietz, Accounting
=destroy=

From the output you can see the employee names printed in batches, interleaved with extra queries against hb_department: without fetch, each Department is loaded lazily on first access. Notice also that once a department has been queried, later employees of the same department do not trigger another query, because the Session's first-level cache already holds that Department object.
马哥私房菜博客地址:https://github.com/mageSFC/myblog

