# JS.OracleDatabase: I got it, now how do I use it?

I want to tell you that if you have made it this far, you are on my team: the unsettled ones who are always looking for improvement.

I’ll be posting some scripts and samples on my PowerShell GitHub repo. I’m quite sure you’ll find some interesting ideas there. Feel free to contact me about modifications you’d like, or about better ways to implement and/or extend the functions already in the module.

Here’s a simple script to get the size of an Oracle cluster database.

```powershell
# Get formatted sizing with % used
Param(
    [Parameter(Mandatory = $true)] [String[]]$TargetDB,
    [Parameter(Mandatory = $true)] [ValidateSet("Full","Storage","Tablespace","Table")] [String]$SizeType,
    [Parameter(Mandatory = $true)] [ValidateSet("B","KB","MB","GB","TB","PB","EB","ZB")] [String]$Unit
)
foreach ($DBName in $TargetDB) {
    Get-OracleSize -TargetDB $DBName -SizeType $SizeType -Unit $Unit |
        Format-Table -AutoSize @{Name = "Database"; Expression = { $_.DBName }; Alignment = "Left"},
            @{Name = "DiskGroup"; Expression = { $_.DiskGroup }; Alignment = "Left"},
            @{Name = "UsedSpace"; Expression = { $_.UsedGB }; Alignment = "Right"},
            @{Name = "AllocatedSpace"; Expression = { $_.AllocatedGB }; Alignment = "Right"},
            @{Name = "MaxSpace"; Expression = { "{0:N2}" -f $_.MaxGB }; Alignment = "Right"},
            # The original listing was cut off after this comma; a "% used" column
            # derived from the fields above is a reasonable reconstruction.
            @{Name = "%Used"; Expression = { "{0:N2}" -f ($_.UsedGB / $_.AllocatedGB * 100) }; Alignment = "Right"}
}
```
Make sure the Oracle client binaries are on your path:

```powershell
$env:PATH = "$env:PATH;$env:ORACLE_HOME/bin"
```

This module can be used on Windows or, if you’re using PowerShell Core, on Linux.

Happy scripting!

# Getting Comfortable with PowerShell

I’m starting my journey with PowerShell. It seemed a rather interesting tool as presented in the MVA courses. So, after the jump start, I decided to go for it and give it a good try on something else I’m passionate about besides scripting and automation: the Oracle Database. That’s how my first PowerShell project was born.

But first, get comfortable with the ISE. Yes, you may want to use the PowerShell Integrated Scripting Environment if you, like me, are taking your first steps in PowerShell scripting.

Now, of course you won’t work directly in the path of the modules that are loaded when PowerShell starts. Thus, it is a good idea to add a persistent path to the $env:PSModulePath environment variable. To do so, choose your path and add it as follows.

```powershell
$CurrentValue = [Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
[Environment]::SetEnvironmentVariable("PSModulePath", $CurrentValue + ";Drive:\Path\to\where\you\will\work\on\Modules", "Machine")
```

If you’re on a multi-user computer, you can set this configuration for your local account only: change the “Machine” scope to the “User” scope, as explained in this MSDN article.

Now let’s get a better Prompt. Mine is like this, minimalist:

```powershell
function Prompt {
    $CurrentLocation = $($(Get-Location).ToString() -split '\\')[-1]
    "[$env:COMPUTERNAME] $CurrentLocation > "
}
```

This function is what you modify to change the appearance and functionality of your prompt. Now, you can get tired of loading this every time you run the PowerShell ISE, so you may want to automate it. Look at the $profile variable: create a new file at that path and paste in whatever you want done during ISE startup.

```
[SOMEHOSTNAME] powershell > $profile
C:\Users\<Your-Username>\Documents\WindowsPowerShell\Microsoft.PowerShellISE_profile.ps1
[SOMEHOSTNAME] powershell > if (-not (Test-Path $profile)) { New-Item -Path $profile -ItemType File }
[SOMEHOSTNAME] powershell > psEdit $profile
```

Note the Test-Path check: $profile always holds a path string even when the file doesn’t exist yet, so you have to test for the file itself.

Then paste something like this, substituting the paths for something that makes sense on your system:

```powershell
# Work or autoload directory
$psScriptsDir = "Drive:\Path\to\powershell\scripts"
# One line to load them all
Get-ChildItem -Path $psScriptsDir\Prompt.ps1 | ForEach-Object { . $_ }
# Then change your location to your work directory
Set-Location Drive:\Path\to\powershell\work
```

Now you’re all set. I’ll leave you with your first five commands!

1. Update-Help: make sure you’re online. This pulls the latest docs for whatever you have installed and configured in your PowerShell installation.
2. Get-Help: gives you the first step on how to use the help.
3. Find-Module: lets you look for keywords in the PowerShell Gallery. Try Find-Module *git* as an example.
4. Save-Module: downloads a module and lets you examine its contents. You can then load the module manually.
5. Install-Module: installs the module in your PowerShell installation and makes it persistent, so it will be available every time you start PowerShell.

Happy scripting!

# How to ‘find’ with argument list too long

Hello dear readers,

Today I’m going to share a very short but (I hope) useful piece of information for some Linux admins, and DBAs as well.

The find tool is very good at isolating files that match a certain pattern. I use it, in particular, for cleaning up old logs and files that are no longer required on the file system. If you generate 5 to 10 logs a day, you could even do this manually, but when you generate 5 to 10 logs per minute, you need to automate the cleanup as much as you can. So piping the following commands together does the job nicely:

```shell
find . -name *xmin*.log -mtime +30 | xargs -r rm -f
```

Or so I thought. Lately I had been receiving the following error:

```
/usr/bin/find: Argument list too long
```

A shame… the cleanup was not working anymore and I had to go over it manually. But manually doesn’t work either when the number of files is huge.

It turns out that the error pops up at shell-substitution time because of the unquoted pattern *xmin*.log. It doesn’t reach find as a pattern at all: the shell expands it into the list of all the file names that match, and that list is too long to pass as arguments.
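To see the difference between shell expansion and find doing its own matching, here is a small self-contained demonstration (the scratch directory and file names are made up purely for illustration):

```shell
# Scratch directory and file names are arbitrary, for illustration only.
demo=$(mktemp -d)
cd "$demo"
touch app_xmin_1.log app_xmin_2.log notes.txt

# Unquoted: the shell expands *xmin*.log into the matching names BEFORE
# find (or any other command) runs, passing each name as a separate argument.
set -- *xmin*.log
expanded=$#
echo "arguments after shell expansion: $expanded"

# Quoted: the command receives the literal string '*xmin*.log' and find
# does the matching internally, so the argument list never grows.
matched=$(find . -name '*xmin*.log' | wc -l)
echo "files matched by find itself: $matched"
```

With two matching files both counts are the same, but with hundreds of thousands of files only the unquoted form blows past the kernel’s argument-size limit.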
So, if you want it to work as expected, just enclose the pattern in single quotes:

```shell
find . -name '*xmin*.log' -mtime +30 | xargs -r rm -f
```

And that’s it… you’re back in the automated game!

# How to avoid plain-text passwords in shell scripts

Hello fellow DBAs and readers, today I want to share with you a good option to avoid having plain-text passwords sitting in storage as part of scripts or automation efforts.

A few months ago, I discovered an issue with a plain-text password sitting in a script. Most of the people I talked to about it disregarded the issue as “no big deal”. But my concern only grew when I found out that scripting like that was common practice, and I decided to do something about it.

There are a lot of options out there, if you ask me. You can even hide the password in compiled code, but the truth is that, given enough time, a hacker who gets that binary can reverse engineer it to find out what you’re using. Python or any other scripting language is even simpler to read if someone finds the code, so keeping our “encryption method” in the same place as the passwords is not a good option.

At some point we stumbled upon a good solution provided by Oracle itself: the Oracle Wallet. If you didn’t know it, you can use it to store external passwords and access them transparently to connect to any Oracle database. You can find the whitepaper by clicking >> here <<

According to the whitepaper, you can use the tnsnames.ora file located in your $ORACLE_HOME/network/admin directory. But for my use case, I needed to keep that one as clean as possible and use a different tnsnames.ora file.

So I have a special directory that will hold the tnsnames.ora and sqlnet.ora files plus the wallet directory; let’s call it /path/to/tns/files. I’ll make my Oracle binaries aware of this location by exporting the TNS_ADMIN environment variable.
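The export described above can be sketched as follows (the path is the example path used throughout this post; add the line to the profile of the account that runs your scripts to make it persistent):

```shell
# Point every Oracle client (sqlplus, mkstore, expdp, ...) at the custom
# directory holding tnsnames.ora, sqlnet.ora and the wallet, instead of
# the default $ORACLE_HOME/network/admin.
export TNS_ADMIN=/path/to/tns/files
echo "TNS_ADMIN is now: $TNS_ADMIN"
```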

Let’s make a long story short and get this done:

1. Create the wallet directory

```shell
mkdir -p /path/to/tns/files/wallet
```

2. Create the tnsnames.ora and sqlnet.ora files

```shell
cd /path/to/tns/files
vi tnsnames.ora
vi sqlnet.ora
```
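• For the tnsnames.ora file, the net-service name you define is what you will later pass to -createCredential and to sqlplus. A hypothetical entry might look like this (host, port, and service name are placeholders you must replace with your own):

```
<DB NAME> =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = orclpdb))
  )
```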
• For the sqlnet.ora file, use the following content:

```
WALLET_LOCATION =
  (SOURCE =
    (METHOD = FILE)
    (METHOD_DATA =
      (DIRECTORY = /path/to/tns/files/wallet)))
SQLNET.WALLET_OVERRIDE = TRUE
SSL_CLIENT_AUTHENTICATION = FALSE
```
3. Create the wallet files

```shell
mkstore -wrl /path/to/tns/files/wallet -create
```

• Now you’ll see the cwallet.sso and ewallet.p12 files sitting there.

4. It’s time to start saving your credentials

```shell
mkstore -wrl /path/to/tns/files/wallet -createCredential <DB NAME> <USERNAME> [<PASSWORD>]
```

You can see I made the password optional. As you may have guessed already, I don’t want to leave the password sitting in my shell’s command history.

So, finally, you can test your connection using:

```shell
sqlplus /@<DB NAME>
```