ORACLE execute error: ORA-01950: no privileges on tablespace 'PDATA'

I’m new to Oracle, so my question may be a silly one. I use Oracle only for data storage. I have done some research, but I’m stuck. I use Oracle 12c. I created a PDB with the admin user PEEI_SYS like this:

create pluggable database PEEI admin user PEEI_SYS identified by PEEI 
roles = (DBA);
-- open PDB PEEI 
alter pluggable database PEEI open read write;

I have created another user called PEEI which should only be able to select, update, and insert on tables owned by PEEI_SYS. I created the user PEEI like this:

CREATE USER "PEEI" IDENTIFIED BY "PEEI" DEFAULT TABLESPACE PDATA TEMPORARY TABLESPACE TEMP PROFILE DEFAULT ACCOUNT UNLOCK;

Now I would like the user PEEI to be able to insert rows into the table PEEI_SYS.PEEI_P_TRACKING. This table is created like this:

CREATE TABLE PEEI_SYS.PEEI_P_TRACKING
( 
"CODE_WORKFLOW" VARCHAR2(30 BYTE), 
"STATUS" VARCHAR2(15 BYTE), 
"DATE_UPDATE" DATE, 
"USER_UPDATE" VARCHAR2(20 BYTE), 
"DEB_WORKFLOW" DATE, 
"FIN_WORKFLOW" DATE, 
"TIME_SECOND" NUMBER
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "PDATA" ;
GRANT SELECT ON PEEI_SYS.PEEI_P_TRACKING TO ROLE_PEEI_READ;
GRANT DELETE ON PEEI_SYS.PEEI_P_TRACKING TO ROLE_PEEI_WRITE;
GRANT INSERT ON PEEI_SYS.PEEI_P_TRACKING TO ROLE_PEEI_WRITE;
GRANT UPDATE ON PEEI_SYS.PEEI_P_TRACKING TO ROLE_PEEI_WRITE;

When I got the error, I granted the PEEI user an unlimited quota on the PDATA tablespace like this:

ALTER USER PEEI QUOTA UNLIMITED ON PDATA;

I still get the error. Could you please help me?
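
In the meantime, here is the check I plan to run as the PDB admin to see whose quota is actually involved (a sketch; my understanding is that ORA-01950 relates to the table owner’s quota, so it may be PEEI_SYS rather than PEEI that needs it):

-- Who currently has a quota on PDATA inside the PDB?
SELECT username, tablespace_name, bytes, max_bytes
FROM   dba_ts_quotas
WHERE  tablespace_name = 'PDATA';

-- If the table owner is the one missing a quota:
ALTER USER PEEI_SYS QUOTA UNLIMITED ON PDATA;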
Thank you very much in advance.
Kind regards,

Ambiguous “invalid operation” while importing with Oracle Data Pump

Let me summarize the problem first; after the summary I’ll give details of the SQL I used to get where I am.

I’m exporting a schema from a production AWS RDS Oracle instance, using a database link to download the file to my local development database, then running an import locally on an empty database of a freshly installed Oracle in a Docker container. The export and import use Datapump. I get a very ambiguous error message “invalid operation” with equally ambiguous details suggesting I call “DBMS_DATAPUMP.GET_STATUS” to “further describe the error”. When I do, I get exactly the same ambiguous “invalid operation” with a suggestion to call “GET_STATUS” to further describe the error.

I’m at a loss as to where to even begin diagnosing and solving this problem.

Here are the detailed steps I took. I have substituted our schema name with “MY_SCHEMA” to protect the identity of our client… and if there’s any mismatch in that text, I assure you it is correct in my console and just a mistake in the substitution for this question. I used SQLDeveloper to run these commands.

  1. On the AWS RDS Oracle instance running 19c
    DECLARE
    hdnl NUMBER;
    BEGIN
    hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null);
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
    DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
    DBMS_DATAPUMP.START_JOB(hdnl);
    END;
    /
  2. Connect from my local dev database to the AWS RDS instance and download the dmp file. And yes, here I connect as the schema owner and not as “master”: downloading the file works when connecting as the schema owner and not as “master”, whereas in step one the export only works as “master” and not as the schema owner; unless you can instruct me how to do it the other way around and whether that would solve my problem.
    create database link to_rds connect to my_schema identified by password using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=my_schema.aljfjske.us-west-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCL)))';
    
    BEGIN
    DBMS_FILE_TRANSFER.GET_FILE(
    source_directory_object       => 'DATA_PUMP_DIR',
    source_file_name              => 'my_schema.dmp',
    source_database               => 'to_rds',
    destination_directory_object  => 'DATA_PUMP_DIR',
    destination_file_name         => 'my_schema.dmp'
    );
    END;
    /
  3. Start the import while logged in as “sys” with role “sysdba” on my local database (connected to the pluggable database called “my_schema”).
    DECLARE
    hdnl NUMBER;
    BEGIN
    hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
    DBMS_DATAPUMP.START_JOB(hdnl);
    end;
    /

And I get the following error:

DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
DBMS_DATAPUMP.START_JOB(hdnl);
end;
Error report -
ORA-39002: invalid operation
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7297
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4932
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7291
ORA-06512: at line 7
39002. 00000 -  "invalid operation"
*Cause:    The current API cannot be executed because of inconsistencies
           between the API and the current definition of the job.
           Subsequent messages supplied by DBMS_DATAPUMP.GET_STATUS
           will further describe the error.
*Action:   Modify the API call to be consistent with the current job or
           redefine the job in a manner that will support the specified API.

I’ve already spent 6+ hours on this, reading Oracle docs and guides, trying things, and printing more information to the console, and nothing. I get exactly the same error message with no more information. The dump file is on the system, and I’m pretty sure it’s being read properly because I can call utl_file.fgetattr to get its size. I’ve also tried exporting and importing with different users. Nothing. I’m totally in the dark here. Even suggestions on what to try to diagnose this would be much appreciated. This is a fresh install of Oracle Database 18c Express Edition using Oracle’s Docker container files from their GitHub account (which is pretty slick, BTW). The production system on RDS has been up for several years, and I’ve exported with Datapump dozens of times during those years and successfully imported into my local 11g Express Edition installation on Fedora Linux. (That no longer works since the production database was recently upgraded from 12c to 19c, which is what started me down this whole path.)
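
In case it is useful, this is the wrapper I used when trying to get more detail out of GET_STATUS (a sketch; same file and schema names as above, run with SET SERVEROUTPUT ON):

DECLARE
  hdnl      NUMBER;
  job_state VARCHAR2(30);
  status    ku$_Status;
  le        ku$_LogEntry;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
  DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''MY_SCHEMA'')');
  DBMS_DATAPUMP.START_JOB(hdnl);
EXCEPTION
  WHEN OTHERS THEN
    -- ask the job itself for its accumulated error log lines
    DBMS_DATAPUMP.GET_STATUS(hdnl, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR, 0, job_state, status);
    le := status.error;
    IF le IS NOT NULL THEN
      FOR i IN le.FIRST .. le.LAST LOOP
        DBMS_OUTPUT.PUT_LINE(le(i).LogText);
      END LOOP;
    END IF;
    RAISE;
END;
/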

Setting up an Oracle linked server using an LDAP connection string

I am trying to set up an Oracle Linked Server in SQL Server 2016. With the information I have for the Oracle server, I am able to reach it with tnsping using a connect string via LDAP.

tnsping Result

Adding the string that works with tnsping as the Provider string in linked server setup, however, doesn’t work and results in an “ORA-12154: TNS:could not resolve the connect identifier specified” error.


I’m sure it’s something I am missing, but don’t know enough about Oracle to know what that might be.
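
For reference, this is the T-SQL equivalent of what I am setting up, with placeholder names (a sketch; I am unsure whether the LDAP-resolved identifier belongs in the data source or in the provider string):

EXEC master.dbo.sp_addlinkedserver
     @server     = N'ORACLE_LINK',      -- hypothetical linked server name
     @srvproduct = N'Oracle',
     @provider   = N'OraOLEDB.Oracle',
     @datasrc    = N'MYDB.EXAMPLE.COM'; -- the identifier that tnsping resolves via LDAP

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORACLE_LINK',
     @useself     = 'FALSE',
     @rmtuser     = N'oracle_user',     -- placeholder credentials
     @rmtpassword = N'oracle_password';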

oracle – Restore non-cdb backup into pdb database

I haven’t found any information on whether it is possible to restore an Oracle non-CDB 19.8 backup into a PDB of a container database (also 19c) or not.
The method I have found is not suitable for my situation (converting the non-CDB to a PDB using DBMS_PDB.DESCRIBE – that seems to imply that the database type is converted and all data is copied during the operation, which is not the same as restoring from a backup).
Export/Import/GoldenGate are not viable options for me.
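
For clarity, the DBMS_PDB route I am referring to (and ruling out) looks roughly like this, with hypothetical paths and names:

-- On the restored non-CDB, opened read only:
EXEC DBMS_PDB.DESCRIBE(pdb_descr_file => '/tmp/noncdb.xml');

-- On the target CDB, plug it in (COPY duplicates the datafiles):
CREATE PLUGGABLE DATABASE mypdb USING '/tmp/noncdb.xml'
  COPY FILE_NAME_CONVERT = ('/u01/oradata/NONCDB/', '/u01/oradata/CDB1/mypdb/');

-- Convert the dictionary and open the new PDB:
ALTER SESSION SET CONTAINER = mypdb;
@?/rdbms/admin/noncdb_to_pdb.sql
ALTER PLUGGABLE DATABASE mypdb OPEN;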

Has anyone had a similar challenge, and what was the best way to get a non-CDB into PDB form?

upgrade – How to change password version by myself in oracle?

Oracle documentation (18c) says:

….

You must expire the users who have only the 10G password version, and
do not have one or both of the 11G or 12C password versions.

For example:

ALTER USER username PASSWORD EXPIRE;

Ask the users whose passwords you expired to log in. When the users
log in, they are prompted to change their passwords. The database
generates the missing 11G and 12C password versions for their account,
in addition to the 10G password version. The 10G password version
continues to be present, because the database is running in the
permissive mode.

But is there some way to do it by myself (and keep the same old password)?
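
What I had in mind is something along these lines (a sketch; I have not verified that simply re-setting the same password regenerates the newer verifiers):

-- Check which password versions an account currently has:
SELECT username, password_versions
FROM   dba_users
WHERE  username = 'SOMEUSER';

-- Re-set the password to the same value; the database should then generate
-- the verifiers allowed by SQLNET.ALLOWED_LOGON_VERSION_SERVER:
ALTER USER someuser IDENTIFIED BY "the_same_old_password";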

oracle – ODI remove auto lock when object / datastore is opened

I was being stupid and enabled the automatic locking of objects when they are edited/opened the first time I was asked.

Now I want to disable the automatic lock from the GUI, but I can’t seem to find the option.
I have googled around and found topics such as these:

https://www.appservgrid.com/documentation111/docs/fmw11g1114documentation/integrate.1111/e12643/appendix_b.htm

https://odielt.wordpress.com/tag/odi-studio-settings/

So I searched my hard drive for userpref.xml and did find one.

I opened the file, and inside it’s just this:

<?xml version="1.0" encoding="UTF-8" ?>
<UserPreferences xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://www.oracle.com/odi/userprefs UserPref.xsd"
                 xmlns="http://www.oracle.com/odi/userprefs">
  <Preference>
    <AcessibilitySetting>
      <Screenreader>false</Screenreader>
      <HighContrast>false</HighContrast>
      <LargeFonts>false</LargeFonts>
    </AcessibilitySetting>
    <UserName>None</UserName>
    <Repository>
      <Name>None</Name>
      <GUID>None</GUID>
    </Repository>
  </Preference>
</UserPreferences>

What can I do so that objects no longer get locked each time I open one? If it can be done through the GUI, that would be perfect; if not, what should I change inside the XML file?

Thanks in advance for any help and advice.

log – Oracle 19c with SAP System – ORA-12154: TNS:could not resolve the connect identifier specified After Configuring Database Vault

I configured Database Vault and created a policy on an Oracle 19c + SAP system. After this operation, on the primary side I have a problem transporting logs to the Data Guard standby.

Output from the primary side:

SQL> select status, error from v$archive_dest where dest_id=2;

STATUS    ERROR
--------- -----------------------------------------------------------------
ERROR     ORA-12154: TNS:could not resolve the connect identifier specified


SQL> show parameter log_archive_dest_2

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
log_archive_dest_2                   string      SERVICE=DRSIDE NOAFFIRM ASYNC V
                                                 ALID_FOR=(ONLINE_LOGFILES,PRIM
                                                 ARY_ROLE) DB_UNIQUE_NAME=DRSIDE

SQL> show parameter log_archive_dest_state_2

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
log_archive_dest_state_2             string      ENABLE

Alert log:

TT00 (PID: a number): Error 12154 received logging on to the standby
2021-02-20T15:48:51.144500+03:00
TT00 (PID: a number): Error 12154 received logging on to the standby
2021-02-20T15:53:51.946941+03:00
TT00 (PID: a number): Error 12154 received logging on to the standby
2021-02-20T15:58:52.744327+03:00
TT00 (PID: a number): Error 12154 received logging on to the standby
2021-02-20T16:03:53.549161+03:00
TT00 (PID: a number): Error 12154 received logging on to the standby

I did not make any changes to the tnsnames.ora or listener.ora files.
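
For additional detail, these are the diagnostics I can run on the primary (a sketch):

-- More context on the transport error than v$archive_dest alone:
SELECT timestamp, facility, severity, error_code, message
FROM   v$dataguard_status
WHERE  severity IN ('Error', 'Fatal')
ORDER BY timestamp;

-- After any change, force an immediate retry of the destination:
ALTER SYSTEM SET log_archive_dest_state_2 = 'DEFER';
ALTER SYSTEM SET log_archive_dest_state_2 = 'ENABLE';
ALTER SYSTEM SWITCH LOGFILE;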

Best regards,

oracle – Select all rows that have all of the following column values

I have the following tables: Genres, Films, and Directors. They have the following schema:

CREATE TABLE GENRES(
    GID INTEGER PRIMARY KEY,
    GENRE VARCHAR(20) UNIQUE NOT NULL
);

CREATE TABLE Films(
    FID INTEGER PRIMARY KEY,
    Title VARCHAR(45) UNIQUE NOT NULL,
    DID INTEGER NOT NULL,
    GID INTEGER NOT NULL,
    FOREIGN KEY (DID) REFERENCES Directors(DID),
    FOREIGN KEY (GID) REFERENCES Genres(GID)
);

CREATE TABLE Directors(
    DID INTEGER PRIMARY KEY,
    First_Name VARCHAR(20) NOT NULL,
    Last_Name VARCHAR(20) NOT NULL
);

I want to write a query that selects all the Director information for every director who has made at least one film in each of the genre(s) another given director has worked in. For example, if Stanley Kubrick has made films in the genres ‘Sci-Fi’, ‘Thriller’, and ‘Crime’, I want to select all the directors who have made at least 1 sci-fi AND 1 thriller AND 1 crime film.

I’ve tried the query below, but it gives me directors who have made at least 1 sci-fi OR 1 thriller OR 1 crime film.

SELECT DISTINCT D.DID, D.First_Name, D.Last_Name
FROM Directors D
LEFT JOIN Films F
ON F.DID = D.DID
LEFT JOIN Genres G
ON G.GID = F.GID
WHERE G.Genre IN (
  SELECT DISTINCT G1.Genre
  FROM Genres G1
  LEFT JOIN Films F1
  ON F1.GID = G1.GID
  LEFT JOIN Directors D1
  ON D1.DID = F1.DID
  WHERE D1.First_Name = 'Stanley'
  AND D1.Last_Name = 'Kubrick'
);

Additionally, I am not able to check beforehand which genres the director in question has been involved with. The query should work with the director’s first and last name as the only given information.
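
The direction I think I need is a group-and-count comparison against the reference director’s genre set, roughly like this (untested sketch using the schema above; note it would also return Stanley Kubrick himself):

SELECT D.DID, D.First_Name, D.Last_Name
FROM Directors D
JOIN Films F  ON F.DID = D.DID
JOIN Genres G ON G.GID = F.GID
WHERE G.Genre IN (
  -- every genre the reference director has worked in
  SELECT G1.Genre
  FROM Genres G1
  JOIN Films F1     ON F1.GID = G1.GID
  JOIN Directors D1 ON D1.DID = F1.DID
  WHERE D1.First_Name = 'Stanley'
    AND D1.Last_Name  = 'Kubrick'
)
GROUP BY D.DID, D.First_Name, D.Last_Name
HAVING COUNT(DISTINCT G.Genre) = (
  -- ...and the director must cover all of those genres
  SELECT COUNT(DISTINCT G2.Genre)
  FROM Genres G2
  JOIN Films F2     ON F2.GID = G2.GID
  JOIN Directors D2 ON D2.DID = F2.DID
  WHERE D2.First_Name = 'Stanley'
    AND D2.Last_Name  = 'Kubrick'
);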

oracle – The installer I made for my Java application does not connect to the database

Here is my situation.
I made a Java application in NetBeans; it is a small CRUD app connected to an Oracle database. I built the .exe with Launch4j and the installer with Inno Setup Compiler, and in the installer I included the dist folder that contains the libraries. However, once I install the program and run it, it does not connect to the database. What could be causing this?

Restart ORACLE OHAS without restarting databases

It looks like ohasd is using too much memory (a memory leak or something), which causes heavy swapping. Is it possible to restart OHAS (until the patch is applied) without bouncing the databases?

"crsctl stop has" puts all resources down 🙁